Dec 04 12:13:15 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 04 12:13:15 crc restorecon[4673]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 12:13:15 crc restorecon[4673]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 04 12:13:15 crc restorecon[4673]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc 
restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc 
restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 
12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 12:13:15 crc restorecon[4673]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc 
restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 12:13:15 crc restorecon[4673]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 12:13:15 crc restorecon[4673]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 12:13:15 crc restorecon[4673]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:15 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:16 crc 
restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 
12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 12:13:16 crc 
restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc 
restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc 
restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 12:13:16 crc restorecon[4673]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc 
restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc 
restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc 
restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc 
restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 12:13:16 crc restorecon[4673]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 12:13:16 crc restorecon[4673]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 04 12:13:17 crc kubenswrapper[4760]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 12:13:17 crc kubenswrapper[4760]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 04 12:13:17 crc kubenswrapper[4760]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 12:13:17 crc kubenswrapper[4760]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 04 12:13:17 crc kubenswrapper[4760]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 04 12:13:17 crc kubenswrapper[4760]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.695554 4760 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698247 4760 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698262 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698267 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698271 4760 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698274 4760 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698278 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698282 4760 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698285 4760 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698289 4760 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698292 4760 feature_gate.go:330] 
unrecognized feature gate: PinnedImages Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698296 4760 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698299 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698304 4760 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698308 4760 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698317 4760 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698322 4760 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698326 4760 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698329 4760 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698332 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698336 4760 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698340 4760 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698344 4760 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698347 4760 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698351 4760 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 12:13:17 crc kubenswrapper[4760]: 
W1204 12:13:17.698354 4760 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698358 4760 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698361 4760 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698365 4760 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698368 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698372 4760 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698375 4760 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698378 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698382 4760 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698385 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698389 4760 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698392 4760 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698396 4760 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698400 4760 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698403 4760 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698406 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698410 4760 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698413 4760 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698417 4760 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698422 4760 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698426 4760 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698431 4760 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698435 4760 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698439 4760 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698443 4760 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698447 4760 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698450 4760 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698454 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698458 4760 feature_gate.go:330] 
unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698461 4760 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698465 4760 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698468 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698472 4760 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698475 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698478 4760 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698482 4760 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698487 4760 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698491 4760 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698496 4760 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698499 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698503 4760 feature_gate.go:330] unrecognized feature gate: Example Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698507 4760 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698512 4760 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698516 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698519 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698523 4760 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.698526 4760 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698604 4760 flags.go:64] FLAG: --address="0.0.0.0" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698613 4760 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698621 4760 flags.go:64] FLAG: --anonymous-auth="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698627 4760 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698632 4760 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698638 4760 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698644 4760 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698650 4760 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698654 4760 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698658 4760 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 04 12:13:17 crc 
kubenswrapper[4760]: I1204 12:13:17.698663 4760 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698667 4760 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698671 4760 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698675 4760 flags.go:64] FLAG: --cgroup-root="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698679 4760 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698683 4760 flags.go:64] FLAG: --client-ca-file="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698687 4760 flags.go:64] FLAG: --cloud-config="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698691 4760 flags.go:64] FLAG: --cloud-provider="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698695 4760 flags.go:64] FLAG: --cluster-dns="[]" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698700 4760 flags.go:64] FLAG: --cluster-domain="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698704 4760 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698708 4760 flags.go:64] FLAG: --config-dir="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698713 4760 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698718 4760 flags.go:64] FLAG: --container-log-max-files="5" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698723 4760 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698728 4760 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698732 4760 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 04 
12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698737 4760 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698741 4760 flags.go:64] FLAG: --contention-profiling="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698745 4760 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698749 4760 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698753 4760 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698757 4760 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698762 4760 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698767 4760 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698771 4760 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698776 4760 flags.go:64] FLAG: --enable-load-reader="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698781 4760 flags.go:64] FLAG: --enable-server="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698785 4760 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698791 4760 flags.go:64] FLAG: --event-burst="100" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698796 4760 flags.go:64] FLAG: --event-qps="50" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698800 4760 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698804 4760 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698808 4760 flags.go:64] FLAG: --eviction-hard="" Dec 04 12:13:17 
crc kubenswrapper[4760]: I1204 12:13:17.698817 4760 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698821 4760 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698826 4760 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698831 4760 flags.go:64] FLAG: --eviction-soft="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698835 4760 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698839 4760 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698843 4760 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698847 4760 flags.go:64] FLAG: --experimental-mounter-path="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698851 4760 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698856 4760 flags.go:64] FLAG: --fail-swap-on="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698860 4760 flags.go:64] FLAG: --feature-gates="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698870 4760 flags.go:64] FLAG: --file-check-frequency="20s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698874 4760 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698878 4760 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698882 4760 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698888 4760 flags.go:64] FLAG: --healthz-port="10248" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698892 4760 flags.go:64] FLAG: --help="false" Dec 04 12:13:17 
crc kubenswrapper[4760]: I1204 12:13:17.698896 4760 flags.go:64] FLAG: --hostname-override="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698900 4760 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698905 4760 flags.go:64] FLAG: --http-check-frequency="20s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698908 4760 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698913 4760 flags.go:64] FLAG: --image-credential-provider-config="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698917 4760 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698921 4760 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698925 4760 flags.go:64] FLAG: --image-service-endpoint="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698929 4760 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698933 4760 flags.go:64] FLAG: --kube-api-burst="100" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698938 4760 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698942 4760 flags.go:64] FLAG: --kube-api-qps="50" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698946 4760 flags.go:64] FLAG: --kube-reserved="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698950 4760 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698954 4760 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698958 4760 flags.go:64] FLAG: --kubelet-cgroups="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698962 4760 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698966 4760 flags.go:64] FLAG: --lock-file="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698970 4760 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698974 4760 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698978 4760 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698985 4760 flags.go:64] FLAG: --log-json-split-stream="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698989 4760 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698994 4760 flags.go:64] FLAG: --log-text-split-stream="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.698998 4760 flags.go:64] FLAG: --logging-format="text" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699002 4760 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699006 4760 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699010 4760 flags.go:64] FLAG: --manifest-url="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699014 4760 flags.go:64] FLAG: --manifest-url-header="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699019 4760 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699023 4760 flags.go:64] FLAG: --max-open-files="1000000" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699029 4760 flags.go:64] FLAG: --max-pods="110" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699033 4760 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699037 4760 flags.go:64] FLAG: 
--maximum-dead-containers-per-container="1" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699041 4760 flags.go:64] FLAG: --memory-manager-policy="None" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699045 4760 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699049 4760 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699053 4760 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699057 4760 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699067 4760 flags.go:64] FLAG: --node-status-max-images="50" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699072 4760 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699077 4760 flags.go:64] FLAG: --oom-score-adj="-999" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699081 4760 flags.go:64] FLAG: --pod-cidr="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699085 4760 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699092 4760 flags.go:64] FLAG: --pod-manifest-path="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699096 4760 flags.go:64] FLAG: --pod-max-pids="-1" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699100 4760 flags.go:64] FLAG: --pods-per-core="0" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699104 4760 flags.go:64] FLAG: --port="10250" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699108 4760 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 04 12:13:17 crc 
kubenswrapper[4760]: I1204 12:13:17.699112 4760 flags.go:64] FLAG: --provider-id="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699116 4760 flags.go:64] FLAG: --qos-reserved="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699120 4760 flags.go:64] FLAG: --read-only-port="10255" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699124 4760 flags.go:64] FLAG: --register-node="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699128 4760 flags.go:64] FLAG: --register-schedulable="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699132 4760 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699139 4760 flags.go:64] FLAG: --registry-burst="10" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699143 4760 flags.go:64] FLAG: --registry-qps="5" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699147 4760 flags.go:64] FLAG: --reserved-cpus="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699151 4760 flags.go:64] FLAG: --reserved-memory="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699156 4760 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699160 4760 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699165 4760 flags.go:64] FLAG: --rotate-certificates="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699169 4760 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699173 4760 flags.go:64] FLAG: --runonce="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699177 4760 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699181 4760 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 
12:13:17.699185 4760 flags.go:64] FLAG: --seccomp-default="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699189 4760 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699237 4760 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699242 4760 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699246 4760 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699251 4760 flags.go:64] FLAG: --storage-driver-password="root" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699256 4760 flags.go:64] FLAG: --storage-driver-secure="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699260 4760 flags.go:64] FLAG: --storage-driver-table="stats" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699264 4760 flags.go:64] FLAG: --storage-driver-user="root" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699268 4760 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699272 4760 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699276 4760 flags.go:64] FLAG: --system-cgroups="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699280 4760 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699286 4760 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699290 4760 flags.go:64] FLAG: --tls-cert-file="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699294 4760 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699299 4760 flags.go:64] FLAG: --tls-min-version="" Dec 04 12:13:17 crc 
kubenswrapper[4760]: I1204 12:13:17.699303 4760 flags.go:64] FLAG: --tls-private-key-file="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699307 4760 flags.go:64] FLAG: --topology-manager-policy="none" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699311 4760 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699315 4760 flags.go:64] FLAG: --topology-manager-scope="container" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699320 4760 flags.go:64] FLAG: --v="2" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699325 4760 flags.go:64] FLAG: --version="false" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699330 4760 flags.go:64] FLAG: --vmodule="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699337 4760 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699341 4760 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699441 4760 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699447 4760 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699451 4760 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699455 4760 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699459 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699463 4760 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699466 4760 feature_gate.go:330] unrecognized feature 
gate: InsightsConfigAPI Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699470 4760 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699475 4760 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699479 4760 feature_gate.go:330] unrecognized feature gate: Example Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699483 4760 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699487 4760 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699491 4760 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699495 4760 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699498 4760 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699502 4760 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699506 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699510 4760 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699514 4760 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699517 4760 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699521 4760 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 12:13:17 crc 
kubenswrapper[4760]: W1204 12:13:17.699525 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699528 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699532 4760 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699535 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699539 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699545 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699549 4760 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699553 4760 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699556 4760 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699560 4760 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699563 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699566 4760 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699570 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699574 4760 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699577 4760 feature_gate.go:330] unrecognized feature 
gate: NewOLM Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699580 4760 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699584 4760 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699587 4760 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699592 4760 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699596 4760 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699600 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699604 4760 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699608 4760 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699612 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699616 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699620 4760 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699623 4760 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699627 4760 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699631 4760 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699636 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699639 4760 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699644 4760 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699648 4760 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699652 4760 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699655 4760 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699660 4760 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699664 4760 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699669 4760 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699673 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699676 4760 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699680 4760 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699683 4760 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699687 4760 feature_gate.go:330] 
unrecognized feature gate: NodeDisruptionPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699690 4760 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699693 4760 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699697 4760 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699700 4760 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699704 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699707 4760 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.699711 4760 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.699723 4760 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.707102 4760 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.707136 4760 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707291 4760 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 12:13:17 crc 
kubenswrapper[4760]: W1204 12:13:17.707302 4760 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707306 4760 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707311 4760 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707314 4760 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707318 4760 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707322 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707325 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707329 4760 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707332 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707337 4760 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707341 4760 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707345 4760 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707349 4760 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707353 4760 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707356 4760 feature_gate.go:330] unrecognized 
feature gate: NetworkLiveMigration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707360 4760 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707364 4760 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707368 4760 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707371 4760 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707375 4760 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707379 4760 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707383 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707387 4760 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707391 4760 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707395 4760 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707400 4760 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707406 4760 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707413 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707418 4760 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707423 4760 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707427 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707432 4760 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707436 4760 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707442 4760 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707446 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707450 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707454 4760 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707458 4760 feature_gate.go:330] unrecognized feature gate: Example Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707462 4760 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707466 4760 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707471 4760 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707475 4760 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707480 4760 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707484 4760 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707488 4760 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707492 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707497 4760 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707501 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707506 4760 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707510 4760 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707514 4760 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707518 4760 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707523 4760 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707528 4760 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707532 4760 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707535 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707539 4760 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707543 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707548 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707552 4760 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707555 4760 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707559 4760 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707562 4760 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707566 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707569 4760 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707573 4760 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707576 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 
12:13:17.707581 4760 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707585 4760 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707589 4760 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.707595 4760 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707724 4760 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707736 4760 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707741 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707745 4760 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707750 4760 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707753 4760 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707757 4760 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707762 4760 feature_gate.go:351] Setting deprecated 
feature gate KMSv1=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707768 4760 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707773 4760 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707779 4760 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707782 4760 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707786 4760 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707790 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707794 4760 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707799 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707802 4760 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707806 4760 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707810 4760 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707813 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707817 4760 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707820 4760 feature_gate.go:330] 
unrecognized feature gate: VSphereDriverConfiguration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707824 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707828 4760 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707831 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707834 4760 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707838 4760 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707841 4760 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707844 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707848 4760 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707851 4760 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707855 4760 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707858 4760 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707863 4760 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707868 4760 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707871 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707875 4760 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707879 4760 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707882 4760 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707886 4760 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707889 4760 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707893 4760 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707896 4760 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707900 4760 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707904 4760 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707907 4760 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707910 4760 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707914 4760 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 12:13:17 crc 
kubenswrapper[4760]: W1204 12:13:17.707917 4760 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707920 4760 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707925 4760 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707928 4760 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707932 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707936 4760 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707941 4760 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707945 4760 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707948 4760 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707952 4760 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707955 4760 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707958 4760 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707962 4760 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707966 4760 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707969 4760 
feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707972 4760 feature_gate.go:330] unrecognized feature gate: Example Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707976 4760 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707980 4760 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707983 4760 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707987 4760 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707990 4760 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707994 4760 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.707998 4760 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.708004 4760 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.708480 4760 server.go:940] "Client rotation is on, will bootstrap in background" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.713836 4760 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 04 
12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.713938 4760 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.714449 4760 server.go:997] "Starting client certificate rotation" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.714475 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.714884 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-29 06:09:36.422566001 +0000 UTC Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.715143 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.729973 4760 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.731835 4760 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.732140 4760 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.744320 4760 log.go:25] "Validated CRI v1 runtime API" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.772984 4760 log.go:25] "Validated CRI v1 image API" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.774910 4760 
server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.776982 4760 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-04-12-08-59-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.777013 4760 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.791397 4760 manager.go:217] Machine: {Timestamp:2025-12-04 12:13:17.788564284 +0000 UTC m=+0.830010871 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:c3d842b6-e196-432f-8258-ff304cd02e6f BootID:f9c326ab-1318-43cc-ac8e-7cfd64c1e669 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} 
{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:4d:29:c1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:4d:29:c1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a5:be:75 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d9:ae:54 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b1:dd:17 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b5:86:2f Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0a:32:e7:f2:2a:6c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:e6:e0:65:72:cc:1a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] 
CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.791776 4760 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.792010 4760 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.792401 4760 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.792611 4760 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.792653 4760 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.792896 4760 topology_manager.go:138] "Creating topology manager with none policy" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.792909 4760 container_manager_linux.go:303] "Creating device plugin manager" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.793136 4760 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.793171 4760 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.793559 4760 state_mem.go:36] "Initialized new in-memory state store" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.793671 4760 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.794331 4760 kubelet.go:418] "Attempting to sync node with API server" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.794359 4760 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.794389 4760 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.794402 4760 kubelet.go:324] "Adding apiserver pod source" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.794417 4760 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.796472 4760 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.797061 4760 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.800109 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.800222 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.800182 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.800299 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.800978 4760 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801650 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801682 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 
12:13:17.801691 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801701 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801714 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801722 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801729 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801741 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801750 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801758 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801780 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801788 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.801989 4760 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.803787 4760 server.go:1280] "Started kubelet" Dec 04 12:13:17 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.804897 4760 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.804958 4760 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.807981 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.809605 4760 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.810630 4760 server.go:460] "Adding debug handlers to kubelet server" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.811823 4760 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.811875 4760 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.811939 4760 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 04:40:15.944336605 +0000 UTC Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.811961 4760 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1024h26m58.132377984s for next certificate rotation Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.813158 4760 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.813258 4760 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.813371 4760 
desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.813159 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.813964 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.814027 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.815196 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="200ms" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.816914 4760 factory.go:55] Registering systemd factory Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.816939 4760 factory.go:221] Registration of the systemd container factory successfully Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.817292 4760 factory.go:153] Registering CRI-O factory Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.817302 4760 factory.go:221] Registration of the crio container factory successfully Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.817389 4760 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot 
unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.817414 4760 factory.go:103] Registering Raw factory Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.817428 4760 manager.go:1196] Started watching for new ooms in manager Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.817247 4760 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.107:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e021612ccab66 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 12:13:17.803752294 +0000 UTC m=+0.845198861,LastTimestamp:2025-12-04 12:13:17.803752294 +0000 UTC m=+0.845198861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.818365 4760 manager.go:319] Starting recovery of all containers Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832227 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832319 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 04 12:13:17 crc 
kubenswrapper[4760]: I1204 12:13:17.832334 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832346 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832357 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832368 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832382 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832397 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832415 4760 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832428 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832440 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832457 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832477 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832506 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832520 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832532 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832547 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832559 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832577 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832592 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832605 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832618 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832635 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832654 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832667 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832679 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832718 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832736 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832749 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832762 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832774 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832788 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832800 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" 
seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832813 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832823 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832836 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832845 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832855 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832866 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832877 4760 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832888 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832898 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832909 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832925 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832938 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832953 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832966 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832979 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.832989 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833000 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833009 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833019 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833035 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833045 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833058 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833069 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833079 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833096 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833108 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833121 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833130 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833142 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833152 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833161 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833171 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833183 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833193 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833203 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833240 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833255 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833266 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833276 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833287 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833297 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833328 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833340 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833350 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833360 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833369 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833380 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833390 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833401 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833433 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833444 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833455 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833464 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833478 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833487 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833496 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833505 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833514 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833529 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833552 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833564 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833574 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833587 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833600 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833613 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833627 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833640 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833652 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833665 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833679 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833691 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833710 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833726 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833741 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833754 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833764 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833776 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833787 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833796 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833808 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833818 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833828 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833846 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833857 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833867 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833878 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833888 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833898 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833909 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833920 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833931 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833941 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833951 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833960 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833969 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833978 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833988 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.833997 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834025 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834034 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834044 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834056 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834066 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834076 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834088 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834096 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834105 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834113 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834122 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834130 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834139 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834148 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834158 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834169 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834179 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834188 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834197 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834233 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834252 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834262 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834271 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834280 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834290 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834300 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834310 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834321 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834330 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834341 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834352 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834363 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834373 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834382 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834391 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834399 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834408 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834417 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834429 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834438 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834448 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834457 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834465 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834474 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834483 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834491 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834500 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834510 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834519 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834527 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834536 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834564 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834575 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834584 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834593 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834603 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204
12:13:17.834614 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834623 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834631 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.834640 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836768 4760 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836794 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" 
Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836806 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836818 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836836 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836852 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836868 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836882 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836893 4760 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836906 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836919 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836930 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836941 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836950 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836959 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836969 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836978 4760 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836987 4760 reconstruct.go:97] "Volume reconstruction finished" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.836994 4760 reconciler.go:26] "Reconciler: start to sync state" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.837135 4760 manager.go:324] Recovery completed Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.845678 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.847149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.847257 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.847271 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.848200 4760 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 
12:13:17.848230 4760 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.848262 4760 state_mem.go:36] "Initialized new in-memory state store" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.860391 4760 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.862847 4760 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.862900 4760 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.862942 4760 kubelet.go:2335] "Starting kubelet main sync loop" Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.863303 4760 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 04 12:13:17 crc kubenswrapper[4760]: W1204 12:13:17.869482 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.869543 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.870241 4760 policy_none.go:49] "None policy: Start" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.871743 4760 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 
12:13:17.871773 4760 state_mem.go:35] "Initializing new in-memory state store" Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.913716 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.929556 4760 manager.go:334] "Starting Device Plugin manager" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.929673 4760 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.929694 4760 server.go:79] "Starting device plugin registration server" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.930224 4760 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.930302 4760 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.930468 4760 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.930546 4760 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.930559 4760 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 04 12:13:17 crc kubenswrapper[4760]: E1204 12:13:17.938014 4760 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.963929 4760 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 
04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.964081 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.971403 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.971465 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.971475 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.971694 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.972658 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.972774 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.973081 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.973104 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.973135 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.973409 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.973655 4760 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.973744 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.974581 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.974605 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.974630 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.974628 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.974809 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.974809 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.974830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.975073 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.975146 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.976427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.976473 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.976518 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.976763 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.976936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.976976 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.977034 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.977054 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.977098 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.977957 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.978397 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.978412 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.981746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.982124 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.982150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.982633 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.982730 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.982764 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.982806 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.982830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.984603 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.984640 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:17 crc kubenswrapper[4760]: I1204 12:13:17.984652 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:18 crc kubenswrapper[4760]: E1204 12:13:18.016356 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="400ms" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.030480 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.031796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 
12:13:18.031848 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.031880 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.031920 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 12:13:18 crc kubenswrapper[4760]: E1204 12:13:18.032548 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.107:6443: connect: connection refused" node="crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.039952 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040010 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040037 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040110 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040349 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040447 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040487 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040506 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040534 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040627 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040691 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040787 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040845 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040886 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.040939 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142175 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142252 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142285 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142307 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142322 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142356 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142357 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142372 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142390 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142423 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142428 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142427 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142473 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142466 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142476 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc 
kubenswrapper[4760]: I1204 12:13:18.142446 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142504 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142515 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142544 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142606 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142637 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142713 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142714 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142698 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142731 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142769 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142788 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142821 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142846 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.142892 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.233658 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.235330 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.235388 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 
12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.235415 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.235447 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 12:13:18 crc kubenswrapper[4760]: E1204 12:13:18.235884 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.107:6443: connect: connection refused" node="crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.297500 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.303813 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.330719 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: W1204 12:13:18.337693 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f02b96daf634648a2e62514fe3339d2eace1ca7afbcceeee675d67fde648b8f2 WatchSource:0}: Error finding container f02b96daf634648a2e62514fe3339d2eace1ca7afbcceeee675d67fde648b8f2: Status 404 returned error can't find the container with id f02b96daf634648a2e62514fe3339d2eace1ca7afbcceeee675d67fde648b8f2 Dec 04 12:13:18 crc kubenswrapper[4760]: W1204 12:13:18.340070 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-98ec5b5491eaece0ce380a3da38fb1efc9e3263fb838cf970941776c0139460a WatchSource:0}: Error finding container 98ec5b5491eaece0ce380a3da38fb1efc9e3263fb838cf970941776c0139460a: Status 404 returned error can't find the container with id 98ec5b5491eaece0ce380a3da38fb1efc9e3263fb838cf970941776c0139460a Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.355595 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.360734 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 12:13:18 crc kubenswrapper[4760]: W1204 12:13:18.379072 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-92a7e1a27570c6a5dbbdd84bf20eb800b48c259e325585b5ec1fa9f421c67c2c WatchSource:0}: Error finding container 92a7e1a27570c6a5dbbdd84bf20eb800b48c259e325585b5ec1fa9f421c67c2c: Status 404 returned error can't find the container with id 92a7e1a27570c6a5dbbdd84bf20eb800b48c259e325585b5ec1fa9f421c67c2c Dec 04 12:13:18 crc kubenswrapper[4760]: W1204 12:13:18.392287 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-bae3dc34d9f3ca34fb9cc0a3ca085ab141f28a47a69458f59917a5ba868518f1 WatchSource:0}: Error finding container bae3dc34d9f3ca34fb9cc0a3ca085ab141f28a47a69458f59917a5ba868518f1: Status 404 returned error can't find the container with id bae3dc34d9f3ca34fb9cc0a3ca085ab141f28a47a69458f59917a5ba868518f1 Dec 04 12:13:18 crc kubenswrapper[4760]: E1204 12:13:18.417374 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="800ms" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.636554 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.637814 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.637849 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 
04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.637858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.637880 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 12:13:18 crc kubenswrapper[4760]: E1204 12:13:18.638284 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.107:6443: connect: connection refused" node="crc" Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.808680 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.866301 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b782607d350e85c5b0a4bd0de6fb3e6dd0e2a387c6e23a6b0f43b71c4f6f0537"} Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.867113 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"92a7e1a27570c6a5dbbdd84bf20eb800b48c259e325585b5ec1fa9f421c67c2c"} Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.867690 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f02b96daf634648a2e62514fe3339d2eace1ca7afbcceeee675d67fde648b8f2"} Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.868243 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"98ec5b5491eaece0ce380a3da38fb1efc9e3263fb838cf970941776c0139460a"} Dec 04 12:13:18 crc kubenswrapper[4760]: I1204 12:13:18.868729 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bae3dc34d9f3ca34fb9cc0a3ca085ab141f28a47a69458f59917a5ba868518f1"} Dec 04 12:13:18 crc kubenswrapper[4760]: W1204 12:13:18.978180 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:18 crc kubenswrapper[4760]: E1204 12:13:18.978529 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:18 crc kubenswrapper[4760]: W1204 12:13:18.994509 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:18 crc kubenswrapper[4760]: E1204 12:13:18.994613 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:19 crc kubenswrapper[4760]: E1204 12:13:19.218635 4760 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="1.6s" Dec 04 12:13:19 crc kubenswrapper[4760]: W1204 12:13:19.244910 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:19 crc kubenswrapper[4760]: E1204 12:13:19.245050 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:19 crc kubenswrapper[4760]: W1204 12:13:19.256479 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:19 crc kubenswrapper[4760]: E1204 12:13:19.256578 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.439295 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.441666 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.441727 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.441740 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.441776 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 12:13:19 crc kubenswrapper[4760]: E1204 12:13:19.443249 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.107:6443: connect: connection refused" node="crc" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.809859 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.842369 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 12:13:19 crc kubenswrapper[4760]: E1204 12:13:19.843758 4760 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.874198 4760 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2" exitCode=0 Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.874324 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2"} Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.874376 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.876274 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.876506 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.876523 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.876791 4760 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5" exitCode=0 Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.876835 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5"} Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.876926 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.878376 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.878424 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.878436 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.879787 4760 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b" exitCode=0 Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.879974 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b"} Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.880008 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.880278 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.881378 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.881417 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.881432 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.881655 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.881700 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.881715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.883272 4760 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5" exitCode=0 Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.883487 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.883873 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5"} Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.884454 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.884486 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.884502 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.886652 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132"} Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.886685 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219"} Dec 04 12:13:19 crc kubenswrapper[4760]: I1204 12:13:19.886696 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64"} Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.808758 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:20 crc kubenswrapper[4760]: E1204 12:13:20.819580 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="3.2s" Dec 04 12:13:20 crc kubenswrapper[4760]: W1204 12:13:20.884702 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:20 crc kubenswrapper[4760]: E1204 12:13:20.884816 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.891167 
4760 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924" exitCode=0 Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.891342 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924"} Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.892886 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c65534c15fddf0804bfa1bd6742373c5f11b4c947f135d29b84eda8e2ff7ace1"} Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.892975 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.894008 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.894077 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.894091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.894654 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047"} Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.894820 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 
12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.895731 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.895770 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.895785 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.896246 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25"} Dec 04 12:13:20 crc kubenswrapper[4760]: I1204 12:13:20.897665 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7"} Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.044057 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.045515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.045574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.045591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.045628 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 12:13:21 crc 
kubenswrapper[4760]: E1204 12:13:21.046172 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.107:6443: connect: connection refused" node="crc" Dec 04 12:13:21 crc kubenswrapper[4760]: W1204 12:13:21.166917 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:21 crc kubenswrapper[4760]: E1204 12:13:21.167318 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:21 crc kubenswrapper[4760]: W1204 12:13:21.394127 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:21 crc kubenswrapper[4760]: E1204 12:13:21.394230 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.809537 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 
04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.904689 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8"} Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.904739 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e"} Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.914648 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.915042 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e"} Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.915086 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759"} Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.915152 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.915186 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.915580 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 
12:13:21.915608 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.915619 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.916176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.916201 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.916239 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.916422 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.916451 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.916462 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:21 crc kubenswrapper[4760]: I1204 12:13:21.939718 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:22 crc kubenswrapper[4760]: W1204 12:13:22.029694 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:22 crc kubenswrapper[4760]: E1204 12:13:22.029812 4760 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.107:6443: connect: connection refused" logger="UnhandledError" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.809594 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.107:6443: connect: connection refused Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.921161 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4bdaad5720b2d90a5722cfc0dfcff2938da3604ef9a6f7898310137af5653177"} Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.921238 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597"} Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.921335 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.922858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.922900 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.922912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.923926 4760 
generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f" exitCode=0 Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.924026 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.924031 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f"} Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.924167 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.924253 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.924787 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.924815 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.924826 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.925256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.925284 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.925326 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.925360 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.925353 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:22 crc kubenswrapper[4760]: I1204 12:13:22.925448 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:23 crc kubenswrapper[4760]: I1204 12:13:23.930906 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f"} Dec 04 12:13:23 crc kubenswrapper[4760]: I1204 12:13:23.930970 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281"} Dec 04 12:13:23 crc kubenswrapper[4760]: I1204 12:13:23.930988 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b"} Dec 04 12:13:23 crc kubenswrapper[4760]: I1204 12:13:23.930997 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:13:23 crc kubenswrapper[4760]: I1204 12:13:23.931049 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:23 crc kubenswrapper[4760]: I1204 12:13:23.932118 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:23 crc kubenswrapper[4760]: I1204 12:13:23.932174 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:23 crc kubenswrapper[4760]: I1204 12:13:23.932188 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:23 crc kubenswrapper[4760]: I1204 12:13:23.959632 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.246809 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.248117 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.248153 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.248166 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.248192 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.676302 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.676479 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.677948 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.677990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:24 crc 
kubenswrapper[4760]: I1204 12:13:24.678000 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.944239 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609"} Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.944294 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68"} Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.944328 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.945137 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.945164 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:24 crc kubenswrapper[4760]: I1204 12:13:24.945174 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.131026 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.131390 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.132925 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:25 crc 
kubenswrapper[4760]: I1204 12:13:25.132983 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.132993 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.203573 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.204183 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.205828 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.205871 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.205885 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.451071 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.451320 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.451387 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.452935 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.452995 4760 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.453019 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.595587 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.666845 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.755754 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.946016 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.946016 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.946103 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.947184 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.947232 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.947245 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.947329 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:25 crc 
kubenswrapper[4760]: I1204 12:13:25.947346 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.947370 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.947383 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.947352 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:25 crc kubenswrapper[4760]: I1204 12:13:25.947417 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:26 crc kubenswrapper[4760]: I1204 12:13:26.518657 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 04 12:13:26 crc kubenswrapper[4760]: I1204 12:13:26.948696 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:26 crc kubenswrapper[4760]: I1204 12:13:26.948696 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:26 crc kubenswrapper[4760]: I1204 12:13:26.949715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:26 crc kubenswrapper[4760]: I1204 12:13:26.949755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:26 crc kubenswrapper[4760]: I1204 12:13:26.949767 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:26 crc kubenswrapper[4760]: I1204 12:13:26.950044 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 04 12:13:26 crc kubenswrapper[4760]: I1204 12:13:26.950076 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:26 crc kubenswrapper[4760]: I1204 12:13:26.950088 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:27 crc kubenswrapper[4760]: I1204 12:13:27.665054 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:27 crc kubenswrapper[4760]: I1204 12:13:27.665283 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:27 crc kubenswrapper[4760]: I1204 12:13:27.666667 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:27 crc kubenswrapper[4760]: I1204 12:13:27.666720 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:27 crc kubenswrapper[4760]: I1204 12:13:27.666734 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:27 crc kubenswrapper[4760]: I1204 12:13:27.677301 4760 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 12:13:27 crc kubenswrapper[4760]: I1204 12:13:27.677409 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:13:27 crc kubenswrapper[4760]: E1204 12:13:27.938196 4760 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 12:13:31 crc kubenswrapper[4760]: I1204 12:13:31.946409 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:31 crc kubenswrapper[4760]: I1204 12:13:31.946579 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:31 crc kubenswrapper[4760]: I1204 12:13:31.947961 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:31 crc kubenswrapper[4760]: I1204 12:13:31.948010 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:31 crc kubenswrapper[4760]: I1204 12:13:31.948022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:32 crc kubenswrapper[4760]: I1204 12:13:32.062805 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 04 12:13:32 crc kubenswrapper[4760]: I1204 12:13:32.063457 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:32 crc kubenswrapper[4760]: I1204 12:13:32.065338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:32 crc kubenswrapper[4760]: I1204 12:13:32.065392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:32 crc kubenswrapper[4760]: I1204 12:13:32.065405 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Dec 04 12:13:33 crc kubenswrapper[4760]: I1204 12:13:33.809793 4760 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 04 12:13:33 crc kubenswrapper[4760]: E1204 12:13:33.961829 4760 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 04 12:13:34 crc kubenswrapper[4760]: E1204 12:13:34.020908 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 04 12:13:34 crc kubenswrapper[4760]: E1204 12:13:34.249537 4760 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 04 12:13:34 crc kubenswrapper[4760]: W1204 12:13:34.807070 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 04 12:13:34 crc kubenswrapper[4760]: I1204 12:13:34.807182 4760 trace.go:236] Trace[1546941713]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 12:13:24.805) (total time: 10001ms): Dec 04 12:13:34 crc kubenswrapper[4760]: Trace[1546941713]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:13:34.807) Dec 04 12:13:34 crc kubenswrapper[4760]: Trace[1546941713]: [10.001258373s] [10.001258373s] END Dec 04 12:13:34 crc kubenswrapper[4760]: E1204 12:13:34.807236 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 04 12:13:34 crc kubenswrapper[4760]: I1204 12:13:34.975981 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 12:13:34 crc kubenswrapper[4760]: I1204 12:13:34.979016 4760 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4bdaad5720b2d90a5722cfc0dfcff2938da3604ef9a6f7898310137af5653177" exitCode=255 Dec 04 12:13:34 crc kubenswrapper[4760]: I1204 12:13:34.979107 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4bdaad5720b2d90a5722cfc0dfcff2938da3604ef9a6f7898310137af5653177"} Dec 04 12:13:34 crc kubenswrapper[4760]: I1204 12:13:34.979405 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:34 crc kubenswrapper[4760]: I1204 12:13:34.980367 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:34 crc kubenswrapper[4760]: I1204 12:13:34.980401 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:34 crc kubenswrapper[4760]: I1204 
12:13:34.980412 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:34 crc kubenswrapper[4760]: I1204 12:13:34.981034 4760 scope.go:117] "RemoveContainer" containerID="4bdaad5720b2d90a5722cfc0dfcff2938da3604ef9a6f7898310137af5653177" Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.067267 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:35 crc kubenswrapper[4760]: W1204 12:13:35.132269 4760 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.132378 4760 trace.go:236] Trace[1608121280]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 12:13:25.130) (total time: 10001ms): Dec 04 12:13:35 crc kubenswrapper[4760]: Trace[1608121280]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:13:35.132) Dec 04 12:13:35 crc kubenswrapper[4760]: Trace[1608121280]: [10.001724035s] [10.001724035s] END Dec 04 12:13:35 crc kubenswrapper[4760]: E1204 12:13:35.132420 4760 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.418626 4760 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with 
statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.418719 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.423341 4760 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.423448 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.485873 4760 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]log ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]etcd ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 04 12:13:35 crc 
kubenswrapper[4760]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/generic-apiserver-start-informers ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/priority-and-fairness-filter ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/start-apiextensions-informers ok Dec 04 12:13:35 crc kubenswrapper[4760]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Dec 04 12:13:35 crc kubenswrapper[4760]: [-]poststarthook/crd-informer-synced failed: reason withheld Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/start-system-namespaces-controller ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 04 12:13:35 crc kubenswrapper[4760]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 04 12:13:35 crc kubenswrapper[4760]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason 
withheld Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/bootstrap-controller ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/start-kube-aggregator-informers ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 04 12:13:35 crc kubenswrapper[4760]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 04 12:13:35 crc kubenswrapper[4760]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]autoregister-completion ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/apiservice-openapi-controller ok Dec 04 12:13:35 crc kubenswrapper[4760]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 04 12:13:35 crc kubenswrapper[4760]: livez check failed Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.486003 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.984814 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.986351 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f"} Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.986545 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.987486 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.987514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:35 crc kubenswrapper[4760]: I1204 12:13:35.987522 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:37 crc kubenswrapper[4760]: I1204 12:13:37.025442 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:37 crc kubenswrapper[4760]: I1204 12:13:37.026101 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:37 crc kubenswrapper[4760]: I1204 12:13:37.026508 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:37 crc kubenswrapper[4760]: I1204 12:13:37.026555 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:37 crc kubenswrapper[4760]: I1204 12:13:37.026570 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:37 crc kubenswrapper[4760]: I1204 12:13:37.677294 4760 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 12:13:37 crc kubenswrapper[4760]: I1204 12:13:37.677379 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 12:13:37 crc kubenswrapper[4760]: E1204 12:13:37.939058 4760 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 12:13:38 crc kubenswrapper[4760]: I1204 12:13:38.027514 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:38 crc kubenswrapper[4760]: I1204 12:13:38.028336 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:38 crc kubenswrapper[4760]: I1204 12:13:38.028362 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:38 crc kubenswrapper[4760]: I1204 12:13:38.028370 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.407410 4760 trace.go:236] Trace[141743438]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 12:13:26.894) (total time: 13512ms): Dec 04 12:13:40 crc kubenswrapper[4760]: Trace[141743438]: ---"Objects listed" error: 13512ms (12:13:40.407) Dec 04 12:13:40 crc kubenswrapper[4760]: Trace[141743438]: [13.512430652s] [13.512430652s] END Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.407451 4760 
reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.415607 4760 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.424461 4760 trace.go:236] Trace[1034039236]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 12:13:26.399) (total time: 14024ms): Dec 04 12:13:40 crc kubenswrapper[4760]: Trace[1034039236]: ---"Objects listed" error: 14024ms (12:13:40.424) Dec 04 12:13:40 crc kubenswrapper[4760]: Trace[1034039236]: [14.024435816s] [14.024435816s] END Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.424512 4760 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.456187 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.456456 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.458027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.458064 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.458108 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.460261 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.649672 4760 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.651567 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.651630 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.651643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.651806 4760 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.674565 4760 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.675106 4760 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 04 12:13:40 crc kubenswrapper[4760]: E1204 12:13:40.675163 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.680432 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.680512 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.680523 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.680541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.680554 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:40Z","lastTransitionTime":"2025-12-04T12:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:40 crc kubenswrapper[4760]: E1204 12:13:40.695261 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.700424 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.700496 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.700509 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.700534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.700550 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:40Z","lastTransitionTime":"2025-12-04T12:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:40 crc kubenswrapper[4760]: E1204 12:13:40.711528 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.717265 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.717317 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.717333 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.717351 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.717362 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:40Z","lastTransitionTime":"2025-12-04T12:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:40 crc kubenswrapper[4760]: E1204 12:13:40.729770 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.733471 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.733498 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.733507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.733522 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:40 crc kubenswrapper[4760]: I1204 12:13:40.733546 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:40Z","lastTransitionTime":"2025-12-04T12:13:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:40 crc kubenswrapper[4760]: E1204 12:13:40.743445 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:40 crc kubenswrapper[4760]: E1204 12:13:40.743591 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:13:40 crc kubenswrapper[4760]: E1204 12:13:40.743614 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:40 crc kubenswrapper[4760]: E1204 12:13:40.843874 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:40 crc kubenswrapper[4760]: E1204 12:13:40.944379 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:41 crc kubenswrapper[4760]: I1204 12:13:41.036138 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:41 crc kubenswrapper[4760]: I1204 12:13:41.037177 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:41 crc kubenswrapper[4760]: I1204 12:13:41.037243 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:41 crc kubenswrapper[4760]: I1204 12:13:41.037260 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:41 crc kubenswrapper[4760]: E1204 12:13:41.044876 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:41 crc kubenswrapper[4760]: E1204 12:13:41.145610 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:41 crc kubenswrapper[4760]: E1204 12:13:41.245866 
4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:41 crc kubenswrapper[4760]: I1204 12:13:41.344495 4760 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 12:13:41 crc kubenswrapper[4760]: E1204 12:13:41.346136 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:41 crc kubenswrapper[4760]: E1204 12:13:41.446558 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:41 crc kubenswrapper[4760]: E1204 12:13:41.547600 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:41 crc kubenswrapper[4760]: E1204 12:13:41.648481 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:41 crc kubenswrapper[4760]: E1204 12:13:41.748636 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:41 crc kubenswrapper[4760]: E1204 12:13:41.849245 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:41 crc kubenswrapper[4760]: E1204 12:13:41.949805 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:42 crc kubenswrapper[4760]: E1204 12:13:42.050752 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:42 crc kubenswrapper[4760]: E1204 12:13:42.190307 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.190590 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 04 12:13:42 crc 
kubenswrapper[4760]: I1204 12:13:42.190774 4760 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.191857 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.191900 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.191914 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.202263 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 04 12:13:42 crc kubenswrapper[4760]: E1204 12:13:42.290849 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.317494 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.331816 4760 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 04 12:13:42 crc kubenswrapper[4760]: E1204 12:13:42.392037 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:42 crc kubenswrapper[4760]: E1204 12:13:42.493346 4760 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.499241 4760 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.595953 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.596010 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.596024 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.596041 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.596057 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:42Z","lastTransitionTime":"2025-12-04T12:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.699815 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.699867 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.699879 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.699898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.699911 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:42Z","lastTransitionTime":"2025-12-04T12:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.802513 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.802556 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.802568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.802598 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.802608 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:42Z","lastTransitionTime":"2025-12-04T12:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.812761 4760 apiserver.go:52] "Watching apiserver" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.905688 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.905735 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.905746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.905766 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:42 crc kubenswrapper[4760]: I1204 12:13:42.905786 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:42Z","lastTransitionTime":"2025-12-04T12:13:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.008844 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.008905 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.008919 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.008934 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.008944 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:43Z","lastTransitionTime":"2025-12-04T12:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.095074 4760 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.095376 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.095787 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.095939 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.096009 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.096056 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.096505 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.096801 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.096845 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.096889 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.096977 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.101927 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.102365 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.102457 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.102607 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.102753 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.103966 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.104653 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.105911 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-operator"/"metrics-tls" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.106104 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.114858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.114939 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.114950 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.114970 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.114985 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:43Z","lastTransitionTime":"2025-12-04T12:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.115320 4760 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.132515 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.132755 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.132896 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.132989 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.133490 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: 
"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.133682 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.133888 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.134320 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135142 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135200 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135379 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135417 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135443 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135464 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135486 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135517 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135548 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135579 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135605 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135631 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135656 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135677 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135698 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135723 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135750 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135773 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135838 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135856 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135877 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135900 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 
04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135918 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135937 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135960 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135979 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.135996 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136012 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136033 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136055 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136074 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136094 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136114 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 12:13:43 crc 
kubenswrapper[4760]: I1204 12:13:43.136134 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136310 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136337 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136360 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136419 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136442 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136464 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136487 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136513 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136534 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136555 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136576 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136597 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136620 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136620 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136642 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136679 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136703 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136726 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136747 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136771 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136795 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136819 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136839 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136864 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136885 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 12:13:43 crc kubenswrapper[4760]: 
I1204 12:13:43.136905 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136925 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136948 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.136971 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137002 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137034 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137067 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137101 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137170 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137190 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137231 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 12:13:43 crc 
kubenswrapper[4760]: I1204 12:13:43.137251 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137270 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137295 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137313 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137332 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137354 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137373 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137381 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137395 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137417 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137435 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137452 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137475 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137492 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137510 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137529 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137547 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137566 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137583 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137600 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137618 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137634 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 12:13:43 crc 
kubenswrapper[4760]: I1204 12:13:43.137650 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137669 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137687 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137705 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137721 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137742 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137761 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137792 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137840 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137860 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137878 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " 
Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137896 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137918 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137957 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.137992 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138018 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138039 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138058 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138077 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138095 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138114 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138138 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: 
I1204 12:13:43.138156 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138175 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138192 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138229 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138248 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138265 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") 
pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138288 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138318 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138344 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138373 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138398 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.138419 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.139100 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140451 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140520 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140554 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140579 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140610 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140634 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140658 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140691 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140715 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 12:13:43 crc kubenswrapper[4760]: 
I1204 12:13:43.140736 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140760 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140782 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140807 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140828 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140858 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140881 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140901 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140926 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140953 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140935 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.140977 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141004 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141028 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141053 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141075 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141099 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141153 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141175 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141197 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141415 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141439 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141492 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141516 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141510 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141559 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141592 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141632 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141658 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141679 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141724 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141731 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141751 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141840 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141915 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141956 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.141993 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142025 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142060 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142094 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142098 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142128 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142157 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142187 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142194 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: 
"kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142238 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142270 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142345 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142378 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142408 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142433 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142461 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142484 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142507 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142535 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142580 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142615 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142649 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142672 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142701 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142724 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142747 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142772 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142800 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142866 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142901 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142956 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142983 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143008 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143037 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143065 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143085 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143105 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143133 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143161 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143188 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143228 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143253 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143350 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143368 4760 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143382 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143397 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143412 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143431 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143446 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143461 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143476 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143490 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 
crc kubenswrapper[4760]: I1204 12:13:43.143505 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143519 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.229776 4760 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.248952 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.252345 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.285700 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.142579 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143052 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.143594 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143598 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143708 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.143977 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.144297 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.144481 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.144955 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.145273 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.145349 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.145478 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.145672 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.145939 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.146050 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.146355 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.146713 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.147026 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.147093 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.147586 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.148241 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.158374 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.158953 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.159022 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.159182 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.159409 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.159575 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.160174 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.160361 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.160456 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.160715 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.160905 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.161148 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.161243 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.161611 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.161805 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.162133 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.162183 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.162406 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.162466 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.162681 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.162768 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.163187 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.163607 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.163750 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.164265 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.164416 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.164625 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.164807 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.164820 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.165078 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.165346 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.165627 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.165795 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.165943 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.166139 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.166165 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.166416 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.166584 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.166625 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.166784 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.166926 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.167121 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.167165 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.167337 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.167475 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.167577 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.167597 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.167783 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.167866 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.168008 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.168122 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.168339 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.168606 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.168721 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.168721 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.168949 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.169146 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.169171 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.169257 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.169710 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.169686 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.170038 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.170257 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.170361 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.170653 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.170976 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.171153 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.171418 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.171605 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.171932 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.216744 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.217036 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.217504 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.217748 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.217945 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.218251 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.218307 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.218678 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.219119 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.219425 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.219463 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.220058 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.220042 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.220067 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.220359 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.220488 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.220518 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.220754 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.222296 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.222914 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.223492 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.223638 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.223821 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.224531 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.225357 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.225857 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.226073 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.226533 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.226806 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.227012 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.227361 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.227594 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.227625 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.227651 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.227672 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.227845 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.228485 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.229664 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.230231 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.230973 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.231007 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.231111 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.232111 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.232536 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.233775 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.235354 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:13:43.735324091 +0000 UTC m=+26.776770658 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.287725 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.238297 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.238484 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.239111 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.239335 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.239820 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.240003 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.240178 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.254437 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.254648 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.254839 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.255038 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.255347 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.255550 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.255912 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.256159 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.266384 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.284414 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288193 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.288233 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:43.788190345 +0000 UTC m=+26.829636912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.288287 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:43.788266788 +0000 UTC m=+26.829713435 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288235 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288328 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288341 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288355 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288364 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288377 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288376 4760 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:43Z","lastTransitionTime":"2025-12-04T12:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.288475 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288489 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288401 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288585 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288609 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288634 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.288641 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288652 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod 
"43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288655 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288691 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.288714 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes/kubernetes.io~secret/srv-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288725 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288713 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288758 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288781 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288803 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288833 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288858 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.288761 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288900 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.288802 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288919 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.288836 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288936 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.288872 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288950 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.288929 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288965 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.288968 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288984 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.289013 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289023 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.289041 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289049 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.289081 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~projected/kube-api-access Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.289090 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289100 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289098 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.289162 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289173 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.289176 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes/kubernetes.io~projected/kube-api-access-fqsjt Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289196 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.289204 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~projected/kube-api-access-qg5z5 Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289249 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.288882 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289290 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289319 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289346 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289373 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289402 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289427 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289456 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289485 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289509 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289535 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289560 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289584 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289607 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289630 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289653 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289675 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289700 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289722 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289748 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 12:13:43 crc 
kubenswrapper[4760]: I1204 12:13:43.289772 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289794 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289816 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289838 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289860 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289881 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289901 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289925 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289947 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289968 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.289991 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290014 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290036 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290059 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290085 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290106 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290133 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290153 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290191 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290242 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290271 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290291 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290312 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290337 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290359 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290385 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290414 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290435 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290456 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290479 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290500 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290524 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290545 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290597 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290619 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290640 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290660 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290682 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290703 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290726 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290752 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290773 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290794 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290817 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290840 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290871 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290894 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290917 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290939 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290960 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.290984 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.291005 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.291037 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.291059 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.291099 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.291122 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.291167 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.291188 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.302660 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.302723 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.302749 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") 
" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.302768 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.302784 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.302887 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.302915 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303016 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303029 4760 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303040 4760 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303049 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303059 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303068 4760 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303078 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303090 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303100 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") 
on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303109 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303118 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303127 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303137 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303148 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303156 4760 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303165 4760 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303173 
4760 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303182 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303191 4760 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303200 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303232 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303241 4760 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303250 4760 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303260 4760 reconciler_common.go:293] "Volume detached for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303269 4760 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303277 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303286 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303294 4760 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303303 4760 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303312 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303321 4760 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" 
Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303330 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303342 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303351 4760 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303359 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303368 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303377 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303386 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303394 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" 
(UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303405 4760 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303413 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303422 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303497 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303506 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303514 4760 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303523 4760 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303532 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303552 4760 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303562 4760 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303571 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303579 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303589 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303598 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node 
\"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303607 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303616 4760 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303625 4760 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303633 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303642 4760 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303651 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303661 4760 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 
12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303669 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303679 4760 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303687 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303697 4760 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303707 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303718 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303727 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303736 4760 reconciler_common.go:293] 
"Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303745 4760 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303754 4760 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303762 4760 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303808 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.303897 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303907 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.303946 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~secret/v4-0-config-user-idp-0-file-data Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303952 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.303990 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes/kubernetes.io~secret/profile-collector-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.303996 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.304031 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes/kubernetes.io~secret/package-server-manager-serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.304036 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.304069 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~secret/metrics-tls Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.304076 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.304111 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.304116 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.313983 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.314371 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes/kubernetes.io~projected/kube-api-access-pjr6v Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.314390 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.314477 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes/kubernetes.io~projected/kube-api-access-xcgwh Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.314487 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.314577 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~secret/serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.314591 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.314655 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.314694 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.314707 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.314797 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.314809 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes/kubernetes.io~projected/kube-api-access-nzwt7 Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.314824 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.314814 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.314896 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.315184 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:43.815094228 +0000 UTC m=+26.856540795 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.315338 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes/kubernetes.io~projected/kube-api-access-w9rds Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.315372 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.315523 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~secret/webhook-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.315537 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.315598 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~secret/serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.315609 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.315657 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~secret/machine-approver-tls Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.315664 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.315719 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes/kubernetes.io~projected/kube-api-access-jkwtn Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.315730 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.315809 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~secret/serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.315817 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.315867 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.315874 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.315918 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~secret/console-oauth-config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.315924 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.315970 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~projected/kube-api-access-mnrrd Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.315976 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316035 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~projected/kube-api-access-bf2bz Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316045 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316085 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~projected/kube-api-access-7c4vf Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316091 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316144 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~secret/etcd-client Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316152 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316196 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316202 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316265 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316272 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316319 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316325 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316368 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316376 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316422 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316428 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316466 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316471 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316508 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~secret/v4-0-config-user-template-provider-selection Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316514 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316554 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes/kubernetes.io~projected/kube-api-access-2w9zh Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316560 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316597 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~secret/console-serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316603 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316633 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316639 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316677 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~projected/bound-sa-token Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316683 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316719 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316725 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316762 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~projected/kube-api-access-2d4wz Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316767 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316803 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~secret/v4-0-config-system-ocp-branding-template Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316810 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316849 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes/kubernetes.io~secret/proxy-tls Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316854 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316887 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~projected/registry-tls Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316893 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316931 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~projected/kube-api-access-w7l8j Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316937 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.316973 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~secret/serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.316980 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.317013 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.317019 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.317075 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~projected/kube-api-access-rnphk Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.317082 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.317118 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.317160 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.317197 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~projected/kube-api-access-d6qdx Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.317217 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.323571 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~projected/kube-api-access-kfwg7 Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.323619 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.323703 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes/kubernetes.io~secret/srv-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.323715 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.323772 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~secret/serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.323779 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.335089 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.335190 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~secret/apiservice-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.335273 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.335178 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.335379 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.335393 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.335431 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.335549 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes/kubernetes.io~configmap/multus-daemon-config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.335561 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.335641 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.335649 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.335775 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~projected/kube-api-access-wxkg8 Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.335789 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.336993 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.337060 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.337185 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes/kubernetes.io~secret/certs Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.337204 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.337294 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~secret/signing-key Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.337304 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.337372 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~projected/bound-sa-token Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.337389 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.337464 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.337474 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.346280 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.355448 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.355988 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.357409 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~projected/kube-api-access-s4n52 Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.357502 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.357702 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~projected/kube-api-access-6ccd8 Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.357716 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.357803 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~secret/serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.357836 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.358152 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.358499 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.358521 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.358541 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.358575 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~projected/kube-api-access-8tdtz Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.358621 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:43.858596089 +0000 UTC m=+26.900042656 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.358616 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.358676 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~secret/installation-pull-secrets Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.358686 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.358774 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes/kubernetes.io~projected/kube-api-access-249nr Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.358782 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.358827 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes/kubernetes.io~secret/webhook-certs Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.358835 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.358898 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~secret/serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.358906 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.358957 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes/kubernetes.io~configmap/ovnkube-config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.358966 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.359055 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~secret/etcd-client Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.359066 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.359073 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.359147 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~projected/kube-api-access-mg5zb Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.359159 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.359290 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~secret/serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.359300 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.359292 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~projected/kube-api-access-sb6h7 Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.359374 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.359462 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~secret/proxy-tls Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.359472 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.359659 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes/kubernetes.io~secret/node-bootstrap-token Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.359696 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.359743 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.360696 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.360803 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes/kubernetes.io~secret/cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.360815 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.360875 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~secret/image-registry-operator-tls Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.360883 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.363747 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.363951 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.364005 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.364015 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.364733 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.365432 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.367485 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.367626 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.367654 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.367772 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.367790 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.367947 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.367967 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.367966 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~secret/serving-cert Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.368302 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.368314 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.368029 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.368273 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.368358 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.368011 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.368960 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.369203 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.369284 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.369498 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.369542 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~secret/v4-0-config-user-template-login Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.369600 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.369666 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~secret/v4-0-config-user-template-error Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.369685 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.370304 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.370320 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.395137 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.395401 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.395472 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.395544 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.395600 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:43Z","lastTransitionTime":"2025-12-04T12:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404708 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404744 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404756 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404770 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404783 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404793 4760 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404804 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404816 4760 reconciler_common.go:293] "Volume 
detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404827 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404837 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404848 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404859 4760 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404881 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404893 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404904 4760 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404915 4760 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404926 4760 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404936 4760 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404947 4760 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404958 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404968 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404979 4760 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.404990 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405001 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405013 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405024 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405034 4760 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405045 4760 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405056 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405066 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405076 4760 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405086 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405097 4760 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405107 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405120 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405156 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405167 4760 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405178 4760 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405190 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405201 4760 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405230 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405241 4760 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405251 4760 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405263 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405273 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405284 4760 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405294 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405306 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405320 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405331 4760 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405342 4760 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc 
kubenswrapper[4760]: I1204 12:13:43.405353 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405364 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405375 4760 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405386 4760 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405396 4760 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405406 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405418 4760 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405430 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405441 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405452 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405463 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405473 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405485 4760 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405495 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405506 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405518 4760 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405529 4760 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405540 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405551 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405562 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405572 4760 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405585 4760 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node 
\"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405598 4760 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405610 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405620 4760 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405631 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405642 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405652 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405663 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405674 4760 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405685 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405698 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405709 4760 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405721 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405733 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405744 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405756 4760 reconciler_common.go:293] "Volume detached for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405767 4760 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405778 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405788 4760 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405799 4760 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405810 4760 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405820 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405831 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405842 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405854 4760 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405866 4760 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405877 4760 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405890 4760 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405901 4760 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405914 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405925 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405938 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.405948 4760 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.408439 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.409717 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.411205 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: 
"ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.411684 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.414728 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.437015 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.450362 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.451116 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.452399 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.471483 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.501186 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.504037 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.507084 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.507527 4760 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.508067 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.508104 4760 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.508119 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.514075 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.555336 4760 csr.go:261] certificate signing request csr-8hlgd is approved, waiting to be issued Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.557948 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.558295 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.558350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.558362 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.558384 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.558396 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:43Z","lastTransitionTime":"2025-12-04T12:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.609051 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.675514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.675550 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.675560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.675578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.675590 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:43Z","lastTransitionTime":"2025-12-04T12:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.675749 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.701721 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.703265 4760 csr.go:257] certificate signing request csr-8hlgd is issued Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.729148 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.737559 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-ce
rts\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:43 crc kubenswrapper[4760]: W1204 12:13:43.743047 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-cf96a99fb688a928cca05b8c638c74f5dc5412b69a1b7e2710f1ad4e74be2022 WatchSource:0}: Error finding container cf96a99fb688a928cca05b8c638c74f5dc5412b69a1b7e2710f1ad4e74be2022: Status 404 returned error can't find the container with id cf96a99fb688a928cca05b8c638c74f5dc5412b69a1b7e2710f1ad4e74be2022 Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.782305 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.784793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.784833 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.784844 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.784861 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.784872 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:43Z","lastTransitionTime":"2025-12-04T12:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.800079 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.814596 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.814718 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.814755 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.814845 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.814906 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:44.814887445 +0000 UTC m=+27.856334012 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.815496 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.815554 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:44.815542777 +0000 UTC m=+27.856989344 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.815644 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:13:44.815607699 +0000 UTC m=+27.857054266 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.871447 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.872174 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.873473 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 
12:13:43.875147 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.875990 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.877510 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.878774 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.880360 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.881714 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.883370 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.884072 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 
12:13:43.885605 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.886887 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.887583 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.887877 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.887931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.887942 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.887965 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.887991 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:43Z","lastTransitionTime":"2025-12-04T12:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.888861 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.889480 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.890590 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.891003 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.891599 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.892684 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.893159 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.894793 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.895320 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.896621 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.897131 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.897844 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.899027 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.899648 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.901294 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.901894 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.903355 4760 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.903501 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.905579 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.907977 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.914922 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.915988 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.916058 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.916293 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.916328 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.916346 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.916414 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:44.916392722 +0000 UTC m=+27.957839289 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.916920 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.916945 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.916951 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.916959 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:43 crc kubenswrapper[4760]: E1204 12:13:43.917327 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:44.917311812 +0000 UTC m=+27.958758379 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.917814 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.919106 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.919917 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.921128 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.921723 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.923015 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.925136 4760 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.928315 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.928846 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.931149 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.931946 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.933543 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.934183 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.935160 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.935766 4760 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.937060 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.937805 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.938428 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.991561 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.991616 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.991632 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.991656 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:43 crc kubenswrapper[4760]: I1204 12:13:43.991672 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:43Z","lastTransitionTime":"2025-12-04T12:13:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.049449 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b3093a7ef20982cbdf4469d19eb66fcef34275d2d7c79497e57cd288e7549632"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.052423 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.052466 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a19159c4d989455d2e0034056729fe9ea8be9464a65f9dadd126c577a7901d38"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.056087 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.056632 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.060629 4760 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f" exitCode=255 Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.060692 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.060760 4760 scope.go:117] "RemoveContainer" containerID="4bdaad5720b2d90a5722cfc0dfcff2938da3604ef9a6f7898310137af5653177" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.064018 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.064112 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cf96a99fb688a928cca05b8c638c74f5dc5412b69a1b7e2710f1ad4e74be2022"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.069614 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.095084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.095127 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.095138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.095158 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.095168 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:44Z","lastTransitionTime":"2025-12-04T12:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.099693 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.118940 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.122573 4760 scope.go:117] "RemoveContainer" containerID="4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f" Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.122830 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.126807 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.198258 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.198306 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.198316 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.198335 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.198350 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:44Z","lastTransitionTime":"2025-12-04T12:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.260345 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.303274 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.303323 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.303336 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.303355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.303381 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:44Z","lastTransitionTime":"2025-12-04T12:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.366630 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.382942 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.404051 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.412615 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.412651 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.412661 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.412680 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.412692 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:44Z","lastTransitionTime":"2025-12-04T12:13:44Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.417084 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.436672 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.457032 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.471841 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.486114 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.506049 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.515917 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.515957 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.515968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.515985 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.515997 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:44Z","lastTransitionTime":"2025-12-04T12:13:44Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.523863 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bdaad5720b2d90a5722cfc0dfcff2938da3604ef9a6f7898310137af5653177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:34Z\\\",\\\"message\\\":\\\"W1204 12:13:23.068146 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1204 12:13:23.068826 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764850403 cert, and key in /tmp/serving-cert-2319760993/serving-signer.crt, /tmp/serving-cert-2319760993/serving-signer.key\\\\nI1204 12:13:23.332347 1 observer_polling.go:159] Starting file observer\\\\nW1204 12:13:23.338635 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1204 12:13:23.338795 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:23.339514 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319760993/tls.crt::/tmp/serving-cert-2319760993/tls.key\\\\\\\"\\\\nF1204 12:13:33.830476 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.552406 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5c
e227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.618855 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.618899 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.618909 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.618931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.618949 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:44Z","lastTransitionTime":"2025-12-04T12:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.667836 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4br74"] Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.668224 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4br74" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.670311 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.670865 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.671786 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.688250 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.688344 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:44Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.699198 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.703938 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-04 12:08:43 +0000 UTC, rotation deadline is 2026-09-02 14:08:21.657373197 +0000 UTC Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.703972 4760 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6529h54m36.95340401s for next certificate rotation Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.704303 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.722198 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:44 crc kubenswrapper[4760]: 
I1204 12:13:44.722256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.722267 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.722286 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.722301 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:44Z","lastTransitionTime":"2025-12-04T12:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.728111 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6f60604a-f694-4df9-bb00-117eb8e9f325-hosts-file\") pod \"node-resolver-4br74\" (UID: \"6f60604a-f694-4df9-bb00-117eb8e9f325\") " pod="openshift-dns/node-resolver-4br74" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.728289 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9sj2\" (UniqueName: \"kubernetes.io/projected/6f60604a-f694-4df9-bb00-117eb8e9f325-kube-api-access-m9sj2\") pod \"node-resolver-4br74\" (UID: \"6f60604a-f694-4df9-bb00-117eb8e9f325\") " pod="openshift-dns/node-resolver-4br74" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.743055 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:44Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.764305 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:44Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.782584 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:44Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.799614 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:44Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.817294 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:44Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.824961 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.824991 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.825001 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.825017 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.825027 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:44Z","lastTransitionTime":"2025-12-04T12:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.829146 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.829271 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.829316 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6f60604a-f694-4df9-bb00-117eb8e9f325-hosts-file\") pod \"node-resolver-4br74\" (UID: \"6f60604a-f694-4df9-bb00-117eb8e9f325\") " pod="openshift-dns/node-resolver-4br74" Dec 04 12:13:44 crc kubenswrapper[4760]: 
I1204 12:13:44.829352 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.829378 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9sj2\" (UniqueName: \"kubernetes.io/projected/6f60604a-f694-4df9-bb00-117eb8e9f325-kube-api-access-m9sj2\") pod \"node-resolver-4br74\" (UID: \"6f60604a-f694-4df9-bb00-117eb8e9f325\") " pod="openshift-dns/node-resolver-4br74" Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.829444 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:13:46.829410469 +0000 UTC m=+29.870857036 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.829467 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6f60604a-f694-4df9-bb00-117eb8e9f325-hosts-file\") pod \"node-resolver-4br74\" (UID: \"6f60604a-f694-4df9-bb00-117eb8e9f325\") " pod="openshift-dns/node-resolver-4br74" Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.829502 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.829526 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.829597 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:46.829577093 +0000 UTC m=+29.871023710 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.829624 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:46.829611005 +0000 UTC m=+29.871057682 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.835732 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:44Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.850051 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9sj2\" (UniqueName: \"kubernetes.io/projected/6f60604a-f694-4df9-bb00-117eb8e9f325-kube-api-access-m9sj2\") pod \"node-resolver-4br74\" (UID: \"6f60604a-f694-4df9-bb00-117eb8e9f325\") " pod="openshift-dns/node-resolver-4br74" Dec 04 12:13:44 crc 
kubenswrapper[4760]: I1204 12:13:44.863857 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.864003 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.864381 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.864431 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.864477 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.864516 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.925494 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bdaad5720b2d90a5722cfc0dfcff2938da3604ef9a6f7898310137af5653177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:34Z\\\",\\\"message\\\":\\\"W1204 12:13:23.068146 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1204 12:13:23.068826 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764850403 cert, and key in /tmp/serving-cert-2319760993/serving-signer.crt, /tmp/serving-cert-2319760993/serving-signer.key\\\\nI1204 12:13:23.332347 1 observer_polling.go:159] Starting file observer\\\\nW1204 12:13:23.338635 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1204 12:13:23.338795 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:23.339514 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319760993/tls.crt::/tmp/serving-cert-2319760993/tls.key\\\\\\\"\\\\nF1204 12:13:33.830476 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:44Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.930528 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.930565 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.930575 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.930595 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.930606 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:44Z","lastTransitionTime":"2025-12-04T12:13:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.932254 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.932307 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.932473 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.932502 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.932518 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.932519 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 
12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.932561 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.932576 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.932582 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:46.932564301 +0000 UTC m=+29.974010868 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:44 crc kubenswrapper[4760]: E1204 12:13:44.932637 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:46.932618533 +0000 UTC m=+29.974065170 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:44 crc kubenswrapper[4760]: I1204 12:13:44.983780 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4br74" Dec 04 12:13:45 crc kubenswrapper[4760]: W1204 12:13:45.017892 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f60604a_f694_4df9_bb00_117eb8e9f325.slice/crio-a004deba2c752e5ec7ccf4a790f2365639323ba2ff882229f0ec4c221d106c63 WatchSource:0}: Error finding container a004deba2c752e5ec7ccf4a790f2365639323ba2ff882229f0ec4c221d106c63: Status 404 returned error can't find the container with id a004deba2c752e5ec7ccf4a790f2365639323ba2ff882229f0ec4c221d106c63 Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.033571 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.033627 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.033642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.033660 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.033674 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:45Z","lastTransitionTime":"2025-12-04T12:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.034446 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.044375 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.068035 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4br74" event={"ID":"6f60604a-f694-4df9-bb00-117eb8e9f325","Type":"ContainerStarted","Data":"a004deba2c752e5ec7ccf4a790f2365639323ba2ff882229f0ec4c221d106c63"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.070327 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.073687 
4760 scope.go:117] "RemoveContainer" containerID="4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f" Dec 04 12:13:45 crc kubenswrapper[4760]: E1204 12:13:45.073918 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.074006 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.109802 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: E1204 12:13:45.114144 4760 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.163989 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-bvk2c"] Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.164614 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-dg5hd"] Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.164710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.164758 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.164772 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.164785 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jnrr9"] Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.165087 4760 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.165189 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.164790 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.165441 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:45Z","lastTransitionTime":"2025-12-04T12:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.165723 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.168455 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.168685 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.169990 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.170161 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.171114 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.171113 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.171370 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.171427 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.171500 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.201942 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.202231 4760 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267428 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-var-lib-cni-bin\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267494 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgd2p\" (UniqueName: \"kubernetes.io/projected/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-kube-api-access-cgd2p\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267531 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267554 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65f76314-9511-40ed-9ad6-2220378e7e97-proxy-tls\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267576 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/bc874ce0-7f43-4ba9-921a-dd8141d738a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267596 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-run-multus-certs\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267617 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf8ph\" (UniqueName: \"kubernetes.io/projected/65f76314-9511-40ed-9ad6-2220378e7e97-kube-api-access-cf8ph\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267670 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-cnibin\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267863 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-var-lib-kubelet\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267882 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-cnibin\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267905 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-cni-dir\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267926 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-run-k8s-cni-cncf-io\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.267950 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65f76314-9511-40ed-9ad6-2220378e7e97-mcd-auth-proxy-config\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268155 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-hostroot\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268259 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk8d2\" (UniqueName: \"kubernetes.io/projected/bc874ce0-7f43-4ba9-921a-dd8141d738a1-kube-api-access-bk8d2\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268284 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-var-lib-cni-multus\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268306 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-daemon-config\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268329 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-run-netns\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268403 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-etc-kubernetes\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268431 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bc874ce0-7f43-4ba9-921a-dd8141d738a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268483 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-socket-dir-parent\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268500 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-conf-dir\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268517 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65f76314-9511-40ed-9ad6-2220378e7e97-rootfs\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268567 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-system-cni-dir\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " 
pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268585 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-os-release\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268599 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-system-cni-dir\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268632 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-os-release\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268646 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-cni-binary-copy\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.268756 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.269365 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:45 crc 
kubenswrapper[4760]: I1204 12:13:45.269390 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.269401 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.269418 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.269428 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:45Z","lastTransitionTime":"2025-12-04T12:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.278290 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.298508 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.312627 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.348112 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bdaad5720b2d90a5722cfc0dfcff2938da3604ef9a6f7898310137af5653177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:34Z\\\",\\\"message\\\":\\\"W1204 12:13:23.068146 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1204 12:13:23.068826 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764850403 cert, and key in /tmp/serving-cert-2319760993/serving-signer.crt, /tmp/serving-cert-2319760993/serving-signer.key\\\\nI1204 12:13:23.332347 1 observer_polling.go:159] Starting file observer\\\\nW1204 12:13:23.338635 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1204 12:13:23.338795 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:23.339514 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2319760993/tls.crt::/tmp/serving-cert-2319760993/tls.key\\\\\\\"\\\\nF1204 12:13:33.830476 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information 
is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90
d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.365492 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.369358 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-hostroot\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.369405 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk8d2\" (UniqueName: \"kubernetes.io/projected/bc874ce0-7f43-4ba9-921a-dd8141d738a1-kube-api-access-bk8d2\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.369425 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-var-lib-cni-multus\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.369443 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-daemon-config\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.369465 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-run-netns\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " 
pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.369482 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-hostroot\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.369594 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-var-lib-cni-multus\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.369627 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-run-netns\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.369797 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-etc-kubernetes\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.370288 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-daemon-config\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.370327 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-etc-kubernetes\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.370365 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bc874ce0-7f43-4ba9-921a-dd8141d738a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.370384 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-socket-dir-parent\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.370401 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-conf-dir\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371048 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65f76314-9511-40ed-9ad6-2220378e7e97-rootfs\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.370498 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-conf-dir\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.370979 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bc874ce0-7f43-4ba9-921a-dd8141d738a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371150 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65f76314-9511-40ed-9ad6-2220378e7e97-rootfs\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.370498 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-socket-dir-parent\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371298 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-system-cni-dir\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371320 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-os-release\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371342 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-system-cni-dir\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371377 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371407 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-system-cni-dir\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371430 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-os-release\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371448 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-cni-binary-copy\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372013 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-var-lib-cni-bin\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372037 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgd2p\" (UniqueName: \"kubernetes.io/projected/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-kube-api-access-cgd2p\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372445 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372478 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65f76314-9511-40ed-9ad6-2220378e7e97-proxy-tls\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372511 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc874ce0-7f43-4ba9-921a-dd8141d738a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372535 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-run-multus-certs\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372557 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf8ph\" (UniqueName: \"kubernetes.io/projected/65f76314-9511-40ed-9ad6-2220378e7e97-kube-api-access-cf8ph\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372590 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-cnibin\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372652 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-var-lib-kubelet\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372679 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-cnibin\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372708 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-cni-dir\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372726 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-run-k8s-cni-cncf-io\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372761 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65f76314-9511-40ed-9ad6-2220378e7e97-mcd-auth-proxy-config\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.373368 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65f76314-9511-40ed-9ad6-2220378e7e97-mcd-auth-proxy-config\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371394 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-os-release\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371579 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-system-cni-dir\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371430 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.374449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.374461 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-cnibin\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.374481 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371962 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-cni-binary-copy\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.374497 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:45Z","lastTransitionTime":"2025-12-04T12:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.374548 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-cnibin\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.371491 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-os-release\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.372083 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-var-lib-cni-bin\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.374572 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-run-multus-certs\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.374621 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-run-k8s-cni-cncf-io\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.374381 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-host-var-lib-kubelet\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.374851 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-multus-cni-dir\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.374872 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc874ce0-7f43-4ba9-921a-dd8141d738a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.375403 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc874ce0-7f43-4ba9-921a-dd8141d738a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.377178 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65f76314-9511-40ed-9ad6-2220378e7e97-proxy-tls\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.379084 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.394694 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.404622 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf8ph\" (UniqueName: \"kubernetes.io/projected/65f76314-9511-40ed-9ad6-2220378e7e97-kube-api-access-cf8ph\") pod \"machine-config-daemon-jnrr9\" (UID: \"65f76314-9511-40ed-9ad6-2220378e7e97\") " pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.410640 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk8d2\" (UniqueName: \"kubernetes.io/projected/bc874ce0-7f43-4ba9-921a-dd8141d738a1-kube-api-access-bk8d2\") pod \"multus-additional-cni-plugins-bvk2c\" (UID: \"bc874ce0-7f43-4ba9-921a-dd8141d738a1\") " pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc 
kubenswrapper[4760]: I1204 12:13:45.411411 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgd2p\" (UniqueName: \"kubernetes.io/projected/017b9fc1-6db4-4786-81f1-6cb9b09c90a3-kube-api-access-cgd2p\") pod \"multus-dg5hd\" (UID: \"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\") " pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.415417 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.434657 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.456942 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.477235 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.478050 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.478085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.478096 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.478113 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.478126 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:45Z","lastTransitionTime":"2025-12-04T12:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.502427 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.522651 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.541511 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.556375 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.572590 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.581555 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.581601 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.581615 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.581634 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.581649 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:45Z","lastTransitionTime":"2025-12-04T12:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.587141 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.599730 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dg5hd" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.606974 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.611952 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.634183 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: W1204 12:13:45.641079 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod017b9fc1_6db4_4786_81f1_6cb9b09c90a3.slice/crio-d891056b593d9e9495d0fc46a291c67b21ed719e604d4d40e7cb1a98307db62e WatchSource:0}: Error finding container d891056b593d9e9495d0fc46a291c67b21ed719e604d4d40e7cb1a98307db62e: Status 404 returned error can't find the container with id d891056b593d9e9495d0fc46a291c67b21ed719e604d4d40e7cb1a98307db62e Dec 04 12:13:45 crc 
kubenswrapper[4760]: I1204 12:13:45.645975 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q8b49"] Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.647021 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.650187 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.650503 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.650875 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.650926 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.650998 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.650896 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.651153 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.655438 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.683884 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.683910 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.683918 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.683931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.683940 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:45Z","lastTransitionTime":"2025-12-04T12:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.690096 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.709628 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.729194 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.760879 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.777530 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-netd\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.777602 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-kubelet\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.777636 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-etc-openvswitch\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.777683 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-env-overrides\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.777709 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-var-lib-openvswitch\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.777754 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-ovn-kubernetes\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.777848 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-slash\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.777977 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-bin\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778024 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69907424-ac0b-4430-b508-af165754104f-ovn-node-metrics-cert\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778058 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-netns\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778086 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778116 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-openvswitch\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778135 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-node-log\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc 
kubenswrapper[4760]: I1204 12:13:45.778164 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-ovn\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778186 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-log-socket\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778227 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hvpf\" (UniqueName: \"kubernetes.io/projected/69907424-ac0b-4430-b508-af165754104f-kube-api-access-7hvpf\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778270 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-script-lib\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778293 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-systemd-units\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 
12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778313 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-systemd\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.778334 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-config\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.779836 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.796281 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.796305 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.796314 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.796328 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.796338 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:45Z","lastTransitionTime":"2025-12-04T12:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.797813 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.824972 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.847741 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.875525 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.880773 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-systemd-units\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.880825 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-systemd\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.880855 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-config\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.880883 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-script-lib\") pod 
\"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.880928 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-netd\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.880950 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-kubelet\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.880971 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-etc-openvswitch\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.880992 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-var-lib-openvswitch\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881011 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-ovn-kubernetes\") pod \"ovnkube-node-q8b49\" (UID: 
\"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881031 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-env-overrides\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881048 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-slash\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881082 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-bin\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881104 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69907424-ac0b-4430-b508-af165754104f-ovn-node-metrics-cert\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881129 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-netns\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881151 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-openvswitch\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881172 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-node-log\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881196 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881247 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-ovn\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881271 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-log-socket\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 
12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881340 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hvpf\" (UniqueName: \"kubernetes.io/projected/69907424-ac0b-4430-b508-af165754104f-kube-api-access-7hvpf\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881723 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-systemd-units\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.881784 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-systemd\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882132 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-node-log\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882152 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-bin\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882186 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-netns\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882189 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-kubelet\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882204 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-var-lib-openvswitch\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882242 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-etc-openvswitch\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882253 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-netd\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882311 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-ovn-kubernetes\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882340 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-ovn\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882361 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-slash\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882359 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882369 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-openvswitch\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882460 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-log-socket\") pod 
\"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.882750 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-env-overrides\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.883133 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-script-lib\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.883255 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-config\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.886128 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69907424-ac0b-4430-b508-af165754104f-ovn-node-metrics-cert\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.893354 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.903581 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.903687 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.903707 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.904023 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.904545 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:45Z","lastTransitionTime":"2025-12-04T12:13:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.922664 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hvpf\" (UniqueName: \"kubernetes.io/projected/69907424-ac0b-4430-b508-af165754104f-kube-api-access-7hvpf\") pod \"ovnkube-node-q8b49\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.922664 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.961707 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.966065 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:45 crc kubenswrapper[4760]: I1204 12:13:45.980598 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:45Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.046822 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.046872 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.046883 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.046904 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.046915 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:46Z","lastTransitionTime":"2025-12-04T12:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.048603 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: W1204 12:13:46.060986 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69907424_ac0b_4430_b508_af165754104f.slice/crio-9f748fec3a4fc962610a6231c9856a92f74d102c9946e071562b2bc03e3c1e8b WatchSource:0}: Error finding container 9f748fec3a4fc962610a6231c9856a92f74d102c9946e071562b2bc03e3c1e8b: Status 404 returned error can't find the container with id 9f748fec3a4fc962610a6231c9856a92f74d102c9946e071562b2bc03e3c1e8b Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.071202 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.080287 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"9f748fec3a4fc962610a6231c9856a92f74d102c9946e071562b2bc03e3c1e8b"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.081469 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" 
event={"ID":"bc874ce0-7f43-4ba9-921a-dd8141d738a1","Type":"ContainerStarted","Data":"b19aa7f1c5d441231cbd09a65103bf9da6a8f162b870d161a43c59aa5c1061a8"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.082771 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dg5hd" event={"ID":"017b9fc1-6db4-4786-81f1-6cb9b09c90a3","Type":"ContainerStarted","Data":"1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.082809 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dg5hd" event={"ID":"017b9fc1-6db4-4786-81f1-6cb9b09c90a3","Type":"ContainerStarted","Data":"d891056b593d9e9495d0fc46a291c67b21ed719e604d4d40e7cb1a98307db62e"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.085942 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.085985 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.086005 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"5fc613e92c20ced3dc28258ab79d1475a01d875395f75ddf039080e7875d4517"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.097108 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4br74" 
event={"ID":"6f60604a-f694-4df9-bb00-117eb8e9f325","Type":"ContainerStarted","Data":"d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.097983 4760 scope.go:117] "RemoveContainer" containerID="4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f" Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.098170 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.100592 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.119754 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.149740 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.149773 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.149783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.149800 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.149810 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:46Z","lastTransitionTime":"2025-12-04T12:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.189033 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.211090 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.246199 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.261332 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.261440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.261450 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.261470 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.261482 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:46Z","lastTransitionTime":"2025-12-04T12:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.275380 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.295906 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.314599 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.333620 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.351603 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.364853 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.364912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.364928 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.364953 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.364969 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:46Z","lastTransitionTime":"2025-12-04T12:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.376446 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.394084 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.420454 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.438524 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.466703 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.469040 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.469088 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 
12:13:46.469099 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.469118 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.469133 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:46Z","lastTransitionTime":"2025-12-04T12:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.496811 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.573099 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.573159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.573176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.573198 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.573230 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:46Z","lastTransitionTime":"2025-12-04T12:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.678796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.678836 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.678846 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.678863 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.678873 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:46Z","lastTransitionTime":"2025-12-04T12:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.781787 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.781829 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.781843 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.781860 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.781872 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:46Z","lastTransitionTime":"2025-12-04T12:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.852692 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.852822 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.852872 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.852954 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.853009 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:50.852995263 +0000 UTC m=+33.894441830 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.853061 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:13:50.853055235 +0000 UTC m=+33.894501792 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.853119 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.853140 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:50.853134487 +0000 UTC m=+33.894581054 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.864117 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.864324 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.864408 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.864470 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.864522 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.864574 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.904516 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.904578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.904590 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.904609 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.904620 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:46Z","lastTransitionTime":"2025-12-04T12:13:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.953769 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:46 crc kubenswrapper[4760]: I1204 12:13:46.953844 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.953980 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.954003 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.954017 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.954070 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:50.954053575 +0000 UTC m=+33.995500142 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.954452 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.954475 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.954485 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:46 crc kubenswrapper[4760]: E1204 12:13:46.954515 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:50.954505889 +0000 UTC m=+33.995952456 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.007176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.007250 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.007263 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.007282 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.007296 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:47Z","lastTransitionTime":"2025-12-04T12:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.102943 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83" exitCode=0 Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.103046 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83"} Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.105230 4760 generic.go:334] "Generic (PLEG): container finished" podID="bc874ce0-7f43-4ba9-921a-dd8141d738a1" containerID="1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b" exitCode=0 Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.105370 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" event={"ID":"bc874ce0-7f43-4ba9-921a-dd8141d738a1","Type":"ContainerDied","Data":"1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b"} Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.111563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.111600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.111613 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.111632 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.111646 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:47Z","lastTransitionTime":"2025-12-04T12:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.138414 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.155893 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.258995 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.259056 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.259068 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.259092 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.259105 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:47Z","lastTransitionTime":"2025-12-04T12:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.288769 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.409845 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.412480 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.412522 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.412529 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.412543 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.412554 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:47Z","lastTransitionTime":"2025-12-04T12:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.439915 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.455518 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.476522 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.492184 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.517101 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.517138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.517149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.517171 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.517182 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:47Z","lastTransitionTime":"2025-12-04T12:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.619530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.620198 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.620240 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.620257 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.620268 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:47Z","lastTransitionTime":"2025-12-04T12:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.707009 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2
b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.723246 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.723271 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.723279 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.723295 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.723304 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:47Z","lastTransitionTime":"2025-12-04T12:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.825818 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.825850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.825859 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.825876 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.825888 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:47Z","lastTransitionTime":"2025-12-04T12:13:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.837104 4760 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 04 12:13:47 crc kubenswrapper[4760]: I1204 12:13:47.839743 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-operator/pods/network-operator-58b4c7f79c-55gtf/status\": read tcp 38.102.83.107:59662->38.102.83.107:6443: use of closed network connection" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.090968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.091404 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.091413 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.091429 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.091439 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:48Z","lastTransitionTime":"2025-12-04T12:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.116540 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.127011 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.127071 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.135788 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" 
event={"ID":"bc874ce0-7f43-4ba9-921a-dd8141d738a1","Type":"ContainerStarted","Data":"2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.150341 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.194046 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.194371 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.194387 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.194394 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:48 
crc kubenswrapper[4760]: I1204 12:13:48.194408 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.194416 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:48Z","lastTransitionTime":"2025-12-04T12:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.211385 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.234588 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.255809 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.274878 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.297181 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.298051 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.298092 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.298151 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.298185 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.298721 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:48Z","lastTransitionTime":"2025-12-04T12:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.321606 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.339521 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.362156 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.379052 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.394533 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.403602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:48 crc 
kubenswrapper[4760]: I1204 12:13:48.403646 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.403679 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.403705 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.403720 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:48Z","lastTransitionTime":"2025-12-04T12:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.413345 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.425958 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.445282 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.466652 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.481563 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.501393 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.506278 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.506338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.506350 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.506373 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.506388 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:48Z","lastTransitionTime":"2025-12-04T12:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.520047 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.538452 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.579246 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.609622 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.609700 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.609711 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.609729 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.609742 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:48Z","lastTransitionTime":"2025-12-04T12:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.612346 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.625979 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.642134 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T1
2:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.659399 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.674765 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.692063 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.713576 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.713643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.713653 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.713675 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.713687 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:48Z","lastTransitionTime":"2025-12-04T12:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.723597 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.743370 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.759431 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.774996 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.816031 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.816059 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 
12:13:48.816066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.816080 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.816088 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:48Z","lastTransitionTime":"2025-12-04T12:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.863382 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.863512 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:48 crc kubenswrapper[4760]: E1204 12:13:48.863624 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.863678 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:48 crc kubenswrapper[4760]: E1204 12:13:48.863792 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:13:48 crc kubenswrapper[4760]: E1204 12:13:48.863901 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.919273 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.919344 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.919366 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.919392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:48 crc kubenswrapper[4760]: I1204 12:13:48.919406 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:48Z","lastTransitionTime":"2025-12-04T12:13:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.022499 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.022568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.022588 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.022616 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.022630 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:49Z","lastTransitionTime":"2025-12-04T12:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.125612 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.125923 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.125938 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.125959 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.125971 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:49Z","lastTransitionTime":"2025-12-04T12:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.142257 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329"} Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.386642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.386707 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.386730 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.386752 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.386770 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:49Z","lastTransitionTime":"2025-12-04T12:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.490507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.490573 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.490586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.490610 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.490623 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:49Z","lastTransitionTime":"2025-12-04T12:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.594129 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.594194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.594226 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.594253 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.594269 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:49Z","lastTransitionTime":"2025-12-04T12:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.729653 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.729720 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.729733 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.729752 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.729824 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:49Z","lastTransitionTime":"2025-12-04T12:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.832038 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.832076 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.832086 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.832104 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.832118 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:49Z","lastTransitionTime":"2025-12-04T12:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.963971 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.964013 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.964024 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.964041 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:49 crc kubenswrapper[4760]: I1204 12:13:49.964053 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:49Z","lastTransitionTime":"2025-12-04T12:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.120175 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.120277 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.120294 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.120319 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.120331 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.166960 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.167019 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.167031 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.223467 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.223497 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.223505 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.223522 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.223535 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.326008 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.326061 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.326074 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.326098 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.326113 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.428004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.428031 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.428039 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.428052 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.428061 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.530742 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.530790 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.530800 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.530816 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.530829 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.633006 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.633042 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.633053 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.633069 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.633082 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.735401 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.735450 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.735461 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.735476 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.735486 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.787649 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.787684 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.787698 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.787715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.787726 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.802663 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:50Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.809013 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.809085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.809102 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.809130 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.809143 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.823415 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:50Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.827276 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.827316 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.827328 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.827345 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.827360 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.841742 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:50Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.845857 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.845904 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.845920 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.845939 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.845953 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.859688 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:50Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.863132 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.863174 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.863200 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.863274 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.863399 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.863505 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.863719 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.863758 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.863771 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.863788 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.863800 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.875060 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:50Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.875183 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.876832 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.876891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.876903 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.876926 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.876947 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.951632 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.951784 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.951870 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:13:58.951835086 +0000 UTC m=+41.993281653 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.951887 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.951946 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:58.951929179 +0000 UTC m=+41.993375836 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.951976 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.952075 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:50 crc kubenswrapper[4760]: E1204 12:13:50.952113 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:58.952102835 +0000 UTC m=+41.993549492 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.980188 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.980237 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.980246 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.980261 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:50 crc kubenswrapper[4760]: I1204 12:13:50.980271 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:50Z","lastTransitionTime":"2025-12-04T12:13:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.053412 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.053465 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:51 crc kubenswrapper[4760]: E1204 12:13:51.053580 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:13:51 crc kubenswrapper[4760]: E1204 12:13:51.053594 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:51 crc kubenswrapper[4760]: E1204 12:13:51.053605 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:51 crc kubenswrapper[4760]: E1204 12:13:51.053624 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 
12:13:51 crc kubenswrapper[4760]: E1204 12:13:51.053653 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:59.053640443 +0000 UTC m=+42.095087010 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:51 crc kubenswrapper[4760]: E1204 12:13:51.053658 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:51 crc kubenswrapper[4760]: E1204 12:13:51.053670 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:51 crc kubenswrapper[4760]: E1204 12:13:51.053763 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 12:13:59.053746666 +0000 UTC m=+42.095193223 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.082980 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.083015 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.083023 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.083037 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.083046 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:51Z","lastTransitionTime":"2025-12-04T12:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.171369 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.173076 4760 generic.go:334] "Generic (PLEG): container finished" podID="bc874ce0-7f43-4ba9-921a-dd8141d738a1" containerID="2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a" exitCode=0 Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.173137 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" event={"ID":"bc874ce0-7f43-4ba9-921a-dd8141d738a1","Type":"ContainerDied","Data":"2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.185328 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.185410 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.185424 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.185444 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.185951 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:51Z","lastTransitionTime":"2025-12-04T12:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.191858 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.206617 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.220480 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.233600 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.247963 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.264341 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.280807 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.288840 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.288912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.288928 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.288949 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.288964 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:51Z","lastTransitionTime":"2025-12-04T12:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.295075 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.308800 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.327721 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.342706 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.357422 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.374092 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.391525 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.391576 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.391588 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.391610 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 
12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.391624 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:51Z","lastTransitionTime":"2025-12-04T12:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.393276 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.398181 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-2cfhd"] Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.398528 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.401355 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.401524 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.401600 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.401679 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.413162 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.424981 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.437088 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.449466 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.457555 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6e89efa-dd42-4b9a-9884-1a870b916762-host\") pod \"node-ca-2cfhd\" (UID: \"a6e89efa-dd42-4b9a-9884-1a870b916762\") " pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.457612 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/a6e89efa-dd42-4b9a-9884-1a870b916762-serviceca\") pod \"node-ca-2cfhd\" (UID: \"a6e89efa-dd42-4b9a-9884-1a870b916762\") " pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.457630 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgrcc\" (UniqueName: \"kubernetes.io/projected/a6e89efa-dd42-4b9a-9884-1a870b916762-kube-api-access-zgrcc\") pod \"node-ca-2cfhd\" (UID: \"a6e89efa-dd42-4b9a-9884-1a870b916762\") " pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.462203 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b
67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.477935 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.494232 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.494190 4760 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.494273 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.494410 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.494427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.494436 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:51Z","lastTransitionTime":"2025-12-04T12:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.508889 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.521351 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.533454 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.557632 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.559919 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgrcc\" (UniqueName: \"kubernetes.io/projected/a6e89efa-dd42-4b9a-9884-1a870b916762-kube-api-access-zgrcc\") pod \"node-ca-2cfhd\" (UID: \"a6e89efa-dd42-4b9a-9884-1a870b916762\") " pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.560023 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6e89efa-dd42-4b9a-9884-1a870b916762-host\") pod \"node-ca-2cfhd\" (UID: \"a6e89efa-dd42-4b9a-9884-1a870b916762\") " 
pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.560068 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a6e89efa-dd42-4b9a-9884-1a870b916762-serviceca\") pod \"node-ca-2cfhd\" (UID: \"a6e89efa-dd42-4b9a-9884-1a870b916762\") " pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.561126 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a6e89efa-dd42-4b9a-9884-1a870b916762-serviceca\") pod \"node-ca-2cfhd\" (UID: \"a6e89efa-dd42-4b9a-9884-1a870b916762\") " pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.561148 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6e89efa-dd42-4b9a-9884-1a870b916762-host\") pod \"node-ca-2cfhd\" (UID: \"a6e89efa-dd42-4b9a-9884-1a870b916762\") " pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.573438 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.580350 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgrcc\" (UniqueName: 
\"kubernetes.io/projected/a6e89efa-dd42-4b9a-9884-1a870b916762-kube-api-access-zgrcc\") pod \"node-ca-2cfhd\" (UID: \"a6e89efa-dd42-4b9a-9884-1a870b916762\") " pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.589787 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.597857 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.597909 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.597920 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.597937 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.597950 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:51Z","lastTransitionTime":"2025-12-04T12:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.608833 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.623271 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.638742 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.650982 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.665319 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 
12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.677946 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.692770 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.700925 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.700968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.700979 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.700995 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.701006 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:51Z","lastTransitionTime":"2025-12-04T12:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.707646 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.712907 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2cfhd" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.720448 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff4
6475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: W1204 12:13:51.730559 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e89efa_dd42_4b9a_9884_1a870b916762.slice/crio-e39a47dd2f66472130e74f0ebaad8493b324e6569e402f45c2c44255ec8df30b WatchSource:0}: Error finding container e39a47dd2f66472130e74f0ebaad8493b324e6569e402f45c2c44255ec8df30b: Status 404 returned error can't find the container with id e39a47dd2f66472130e74f0ebaad8493b324e6569e402f45c2c44255ec8df30b Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.732689 4760 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.751198 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0
7372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.773708 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.796653 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.804441 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.804497 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.804510 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.804529 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.804541 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:51Z","lastTransitionTime":"2025-12-04T12:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.819253 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.840646 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.859651 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:51Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.908032 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.908073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.908082 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.908097 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:51 crc kubenswrapper[4760]: I1204 12:13:51.908107 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:51Z","lastTransitionTime":"2025-12-04T12:13:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.010712 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.010761 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.010772 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.010789 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.010800 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:52Z","lastTransitionTime":"2025-12-04T12:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.113249 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.113311 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.113327 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.113349 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.113364 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:52Z","lastTransitionTime":"2025-12-04T12:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.178086 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2cfhd" event={"ID":"a6e89efa-dd42-4b9a-9884-1a870b916762","Type":"ContainerStarted","Data":"e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.178172 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2cfhd" event={"ID":"a6e89efa-dd42-4b9a-9884-1a870b916762","Type":"ContainerStarted","Data":"e39a47dd2f66472130e74f0ebaad8493b324e6569e402f45c2c44255ec8df30b"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.186508 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.190909 4760 generic.go:334] "Generic (PLEG): container finished" podID="bc874ce0-7f43-4ba9-921a-dd8141d738a1" containerID="b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13" exitCode=0 Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.191704 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" event={"ID":"bc874ce0-7f43-4ba9-921a-dd8141d738a1","Type":"ContainerDied","Data":"b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.196882 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.215686 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.215721 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.215733 4760 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.215751 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.215762 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:52Z","lastTransitionTime":"2025-12-04T12:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.222652 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.236530 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.249597 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.269115 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.280636 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.294247 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 
12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.305914 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.314456 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.317770 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.317817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.317828 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:52 crc 
kubenswrapper[4760]: I1204 12:13:52.317844 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.317856 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:52Z","lastTransitionTime":"2025-12-04T12:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.328023 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.343606 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.353506 
4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\"
,\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.366184 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.378529 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.395181 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.413315 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.419821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.419858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.419870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.419887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.419897 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:52Z","lastTransitionTime":"2025-12-04T12:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.427400 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.439112 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.454305 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 
12:13:52.472744 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.488924 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.502195 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.513922 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.523085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.523138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.523149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:52 crc 
kubenswrapper[4760]: I1204 12:13:52.523168 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.523181 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:52Z","lastTransitionTime":"2025-12-04T12:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.527282 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.538136 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c
9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.549179 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.562017 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.574892 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.589351 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.605038 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.626295 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.626337 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.626352 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.626367 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.626380 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:52Z","lastTransitionTime":"2025-12-04T12:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.728744 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.728785 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.728796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.728813 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.728824 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:52Z","lastTransitionTime":"2025-12-04T12:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.831695 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.831736 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.831749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.831765 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.831774 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:52Z","lastTransitionTime":"2025-12-04T12:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.863625 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.863625 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:52 crc kubenswrapper[4760]: E1204 12:13:52.863779 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:13:52 crc kubenswrapper[4760]: E1204 12:13:52.863833 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.863658 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:52 crc kubenswrapper[4760]: E1204 12:13:52.864161 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.934114 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.934154 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.934165 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.934185 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:52 crc kubenswrapper[4760]: I1204 12:13:52.934200 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:52Z","lastTransitionTime":"2025-12-04T12:13:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.037323 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.037372 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.037383 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.037401 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.037413 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:53Z","lastTransitionTime":"2025-12-04T12:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.140619 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.141365 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.141395 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.141422 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.141439 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:53Z","lastTransitionTime":"2025-12-04T12:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.196598 4760 generic.go:334] "Generic (PLEG): container finished" podID="bc874ce0-7f43-4ba9-921a-dd8141d738a1" containerID="e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47" exitCode=0 Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.196642 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" event={"ID":"bc874ce0-7f43-4ba9-921a-dd8141d738a1","Type":"ContainerDied","Data":"e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.215652 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.227097 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086
a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.241999 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 
12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.244404 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.244447 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.244459 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.244478 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:53 crc 
kubenswrapper[4760]: I1204 12:13:53.244489 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:53Z","lastTransitionTime":"2025-12-04T12:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.257134 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.270154 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.284492 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.300361 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.313815 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.329012 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.346203 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.348331 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.348411 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.348423 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.348440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.348452 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:53Z","lastTransitionTime":"2025-12-04T12:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.360084 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.380790 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.398823 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.411914 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.427729 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:53Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.458501 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.458580 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.458597 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.458628 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.458646 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:53Z","lastTransitionTime":"2025-12-04T12:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.561637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.561690 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.561699 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.561715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.561723 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:53Z","lastTransitionTime":"2025-12-04T12:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.664408 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.664491 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.664500 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.664534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.664547 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:53Z","lastTransitionTime":"2025-12-04T12:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.779148 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.779298 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.779315 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.779377 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.779391 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:53Z","lastTransitionTime":"2025-12-04T12:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.882101 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.882147 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.882159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.882179 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.882190 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:53Z","lastTransitionTime":"2025-12-04T12:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.984389 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.984426 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.984437 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.984455 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:53 crc kubenswrapper[4760]: I1204 12:13:53.984465 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:53Z","lastTransitionTime":"2025-12-04T12:13:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.087259 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.087298 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.087309 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.087325 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.087335 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:54Z","lastTransitionTime":"2025-12-04T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.189521 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.189551 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.189558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.189571 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.189579 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:54Z","lastTransitionTime":"2025-12-04T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.201342 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" event={"ID":"bc874ce0-7f43-4ba9-921a-dd8141d738a1","Type":"ContainerStarted","Data":"1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43"} Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.215896 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.301524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.301574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.301585 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.301601 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.301613 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:54Z","lastTransitionTime":"2025-12-04T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.311324 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.325664 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.341547 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.350893 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.362259 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.374988 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.388227 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.407653 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.407714 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.407726 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.407748 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.407760 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:54Z","lastTransitionTime":"2025-12-04T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.407776 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.420383 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.444962 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.456762 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.470124 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.485432 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8
d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.506122 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:54Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.509814 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.509853 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.509865 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.509882 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.509893 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:54Z","lastTransitionTime":"2025-12-04T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.676349 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.676391 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.676402 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.676421 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.676434 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:54Z","lastTransitionTime":"2025-12-04T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.801401 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.801858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.801871 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.802061 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.802080 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:54Z","lastTransitionTime":"2025-12-04T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.863338 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.863368 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.863338 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:54 crc kubenswrapper[4760]: E1204 12:13:54.863475 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:13:54 crc kubenswrapper[4760]: E1204 12:13:54.863587 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:13:54 crc kubenswrapper[4760]: E1204 12:13:54.863665 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.904110 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.904154 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.904166 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.904181 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:54 crc kubenswrapper[4760]: I1204 12:13:54.904191 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:54Z","lastTransitionTime":"2025-12-04T12:13:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.005882 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.005904 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.005913 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.005927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.005937 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:55Z","lastTransitionTime":"2025-12-04T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.108233 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.108277 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.108286 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.108308 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.108324 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:55Z","lastTransitionTime":"2025-12-04T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.212239 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.212264 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.212272 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.212286 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.212295 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:55Z","lastTransitionTime":"2025-12-04T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.212974 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.213978 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.214047 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.365547 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.367440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.367481 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.367494 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.367512 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.367523 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:55Z","lastTransitionTime":"2025-12-04T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.388411 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.405044 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\
",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01d
cb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9
be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.438097 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.454973 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.470074 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.470126 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.470138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.470153 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.470163 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:55Z","lastTransitionTime":"2025-12-04T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.473461 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8
d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.495713 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.508206 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.520488 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.536981 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.549826 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.565861 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.577055 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.577091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.577101 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.577114 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.577123 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:55Z","lastTransitionTime":"2025-12-04T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.581652 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.598551 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.616137 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.630464 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.644188 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.659092 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.676067 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.680083 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.680117 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.680127 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.680145 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.680155 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:55Z","lastTransitionTime":"2025-12-04T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.697069 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.715203 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.728989 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.751689 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.766690 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.782115 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.783325 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.783444 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.783527 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.783614 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.783715 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:55Z","lastTransitionTime":"2025-12-04T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.801681 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8
d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.822701 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.838133 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.850902 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.864702 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.877915 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.885915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:55 crc 
kubenswrapper[4760]: I1204 12:13:55.885951 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.885961 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.885974 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.885983 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:55Z","lastTransitionTime":"2025-12-04T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.890962 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:55Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.989084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.989121 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.989133 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.989149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:55 crc kubenswrapper[4760]: I1204 12:13:55.989160 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:55Z","lastTransitionTime":"2025-12-04T12:13:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.091020 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.091057 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.091067 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.091083 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.091094 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:56Z","lastTransitionTime":"2025-12-04T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.194107 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.194173 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.194183 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.194202 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.194246 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:56Z","lastTransitionTime":"2025-12-04T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.220291 4760 generic.go:334] "Generic (PLEG): container finished" podID="bc874ce0-7f43-4ba9-921a-dd8141d738a1" containerID="1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43" exitCode=0 Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.220364 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" event={"ID":"bc874ce0-7f43-4ba9-921a-dd8141d738a1","Type":"ContainerDied","Data":"1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43"} Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.220448 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.243718 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.260613 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.275994 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.291397 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.296358 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:56 crc 
kubenswrapper[4760]: I1204 12:13:56.296381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.296390 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.296406 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.296415 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:56Z","lastTransitionTime":"2025-12-04T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.302800 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.317341 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.332877 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.399461 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.399506 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.399516 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.399530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.399540 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:56Z","lastTransitionTime":"2025-12-04T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.407570 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.425994 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.438319 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.462400 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.479848 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.493907 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.502328 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.502357 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.502365 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.502380 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.502389 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:56Z","lastTransitionTime":"2025-12-04T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.509493 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.530788 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:56Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.605264 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.605311 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.605322 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.605339 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.605349 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:56Z","lastTransitionTime":"2025-12-04T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.708199 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.708259 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.708269 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.708286 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.708295 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:56Z","lastTransitionTime":"2025-12-04T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.810964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.811025 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.811036 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.811055 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.811069 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:56Z","lastTransitionTime":"2025-12-04T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.863692 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.863696 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.864285 4760 scope.go:117] "RemoveContainer" containerID="4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.864449 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:56 crc kubenswrapper[4760]: E1204 12:13:56.864600 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:13:56 crc kubenswrapper[4760]: E1204 12:13:56.864679 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:13:56 crc kubenswrapper[4760]: E1204 12:13:56.864764 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.913515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.913560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.913572 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.913591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:56 crc kubenswrapper[4760]: I1204 12:13:56.913601 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:56Z","lastTransitionTime":"2025-12-04T12:13:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.017255 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.017586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.017609 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.017629 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.017640 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:57Z","lastTransitionTime":"2025-12-04T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.119779 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.119839 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.119851 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.119870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.119880 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:57Z","lastTransitionTime":"2025-12-04T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.222127 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.222235 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.222252 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.222276 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.222291 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:57Z","lastTransitionTime":"2025-12-04T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.232821 4760 generic.go:334] "Generic (PLEG): container finished" podID="bc874ce0-7f43-4ba9-921a-dd8141d738a1" containerID="ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876" exitCode=0 Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.233009 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.233356 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" event={"ID":"bc874ce0-7f43-4ba9-921a-dd8141d738a1","Type":"ContainerDied","Data":"ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.256900 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.278059 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.293482 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.306291 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.321598 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.326283 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:57 crc 
kubenswrapper[4760]: I1204 12:13:57.326369 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.326417 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.326439 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.326452 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:57Z","lastTransitionTime":"2025-12-04T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.335570 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.349049 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.365758 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.382468 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.398999 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.413306 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.428829 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.428881 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.428894 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.428915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.428930 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:57Z","lastTransitionTime":"2025-12-04T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.437373 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.454293 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.469549 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.487107 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.541308 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.541363 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.541378 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.541396 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.541407 4760 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:57Z","lastTransitionTime":"2025-12-04T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.645923 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.645962 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.645973 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.645991 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.646002 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:57Z","lastTransitionTime":"2025-12-04T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.748459 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.748496 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.748507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.748524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.748534 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:57Z","lastTransitionTime":"2025-12-04T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.851585 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.851627 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.851639 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.851657 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.851671 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:57Z","lastTransitionTime":"2025-12-04T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.882971 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.901478 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.920090 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.938424 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.952642 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.954580 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.954641 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.954652 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.954672 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.954685 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:57Z","lastTransitionTime":"2025-12-04T12:13:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.970630 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2
b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:57 crc kubenswrapper[4760]: I1204 12:13:57.988533 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.008227 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.025414 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.042285 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.059485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.059583 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.059595 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.059615 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.059627 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:58Z","lastTransitionTime":"2025-12-04T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.070671 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.085725 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.100593 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.120928 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.152675 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.161680 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.161753 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.161768 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.161791 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.161806 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:58Z","lastTransitionTime":"2025-12-04T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.238935 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.241302 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.241827 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.249878 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" event={"ID":"bc874ce0-7f43-4ba9-921a-dd8141d738a1","Type":"ContainerStarted","Data":"e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.262329 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.264373 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.264427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.264438 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.264459 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.264473 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:58Z","lastTransitionTime":"2025-12-04T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.281272 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.298730 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.315680 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.328640 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.353231 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.369013 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.369062 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.369074 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.369097 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 
12:13:58.369110 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:58Z","lastTransitionTime":"2025-12-04T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.383259 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.402865 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.438032 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.461249 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.473066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.473117 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.473136 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.473161 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.473181 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:58Z","lastTransitionTime":"2025-12-04T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.482886 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.499317 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.512895 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.532918 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.547283 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.567131 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.576517 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.576542 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.576550 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.576563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.576572 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:58Z","lastTransitionTime":"2025-12-04T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.586589 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.619420 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.652527 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.668033 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.679769 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.679812 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.679821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.679840 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.679851 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:58Z","lastTransitionTime":"2025-12-04T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.684608 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.699665 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.713330 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.765459 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.824992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.825066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.825081 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.825105 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.825119 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:58Z","lastTransitionTime":"2025-12-04T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.841902 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg"] Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.843124 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.844194 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb
286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.846783 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.847131 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.864137 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.864474 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.864604 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:58 crc kubenswrapper[4760]: E1204 12:13:58.864705 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:13:58 crc kubenswrapper[4760]: E1204 12:13:58.864855 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.864529 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:58 crc kubenswrapper[4760]: E1204 12:13:58.866662 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.893256 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.909936 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.923166 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.923728 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb88n\" (UniqueName: \"kubernetes.io/projected/345f593d-ac28-4bf4-aed0-adbad7c3a90e-kube-api-access-jb88n\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.923856 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/345f593d-ac28-4bf4-aed0-adbad7c3a90e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.923953 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/345f593d-ac28-4bf4-aed0-adbad7c3a90e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.924029 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/345f593d-ac28-4bf4-aed0-adbad7c3a90e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.927978 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.928137 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.928199 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.928310 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.928380 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:58Z","lastTransitionTime":"2025-12-04T12:13:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.939790 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.954416 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.972691 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:58 crc kubenswrapper[4760]: I1204 12:13:58.988096 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.004313 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.022251 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.024901 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.025084 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:14:15.025057874 +0000 UTC m=+58.066504441 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.025150 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb88n\" (UniqueName: \"kubernetes.io/projected/345f593d-ac28-4bf4-aed0-adbad7c3a90e-kube-api-access-jb88n\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.025224 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/345f593d-ac28-4bf4-aed0-adbad7c3a90e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.025267 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.025295 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/345f593d-ac28-4bf4-aed0-adbad7c3a90e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.025329 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/345f593d-ac28-4bf4-aed0-adbad7c3a90e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.025367 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 
04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.025409 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.025518 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:14:15.025494269 +0000 UTC m=+58.066941016 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.025523 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.025595 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:14:15.025574502 +0000 UTC m=+58.067021069 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.026197 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/345f593d-ac28-4bf4-aed0-adbad7c3a90e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.026368 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/345f593d-ac28-4bf4-aed0-adbad7c3a90e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.031020 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.031071 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.031091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.031116 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.031130 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:59Z","lastTransitionTime":"2025-12-04T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.032807 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/345f593d-ac28-4bf4-aed0-adbad7c3a90e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.043903 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb88n\" (UniqueName: \"kubernetes.io/projected/345f593d-ac28-4bf4-aed0-adbad7c3a90e-kube-api-access-jb88n\") pod \"ovnkube-control-plane-749d76644c-4jxpg\" (UID: \"345f593d-ac28-4bf4-aed0-adbad7c3a90e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.049715 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.069665 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a67
7577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.095866 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b
01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.110623 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.125694 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.125792 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.125947 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.125965 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.125977 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.126029 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 12:14:15.126014923 +0000 UTC m=+58.167461490 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.126061 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.126123 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.126141 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:59 crc kubenswrapper[4760]: E1204 12:13:59.126249 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 12:14:15.12622216 +0000 UTC m=+58.167668727 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.127730 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\
\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.139755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.139833 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.139844 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.139864 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.139878 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:59Z","lastTransitionTime":"2025-12-04T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.152194 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.164652 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.182637 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.198539 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.211232 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.243099 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.243489 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.243516 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.243532 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:59 crc 
kubenswrapper[4760]: I1204 12:13:59.243559 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.243576 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:59Z","lastTransitionTime":"2025-12-04T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.257682 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" event={"ID":"345f593d-ac28-4bf4-aed0-adbad7c3a90e","Type":"ContainerStarted","Data":"b8db116ccf711e6b151f699e7badc01c3b4d1e920672346812015f4ae7b3738d"} Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.274391 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:13:59Z is after 2025-08-24T17:21:41Z" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.346032 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:59 crc 
kubenswrapper[4760]: I1204 12:13:59.346084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.346096 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.346116 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.346130 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:59Z","lastTransitionTime":"2025-12-04T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.449501 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.449540 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.449549 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.449564 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.449572 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:59Z","lastTransitionTime":"2025-12-04T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.551499 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.551548 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.551560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.551578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.551587 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:59Z","lastTransitionTime":"2025-12-04T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.656904 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.656960 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.656973 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.656992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.657004 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:59Z","lastTransitionTime":"2025-12-04T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.759419 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.759476 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.759487 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.759506 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.759516 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:59Z","lastTransitionTime":"2025-12-04T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.861459 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.861495 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.861503 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.861517 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.861527 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:59Z","lastTransitionTime":"2025-12-04T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.964509 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.964547 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.964557 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.964571 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:13:59 crc kubenswrapper[4760]: I1204 12:13:59.964582 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:13:59Z","lastTransitionTime":"2025-12-04T12:13:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.044136 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xpngr"] Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.045132 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:00 crc kubenswrapper[4760]: E1204 12:14:00.045227 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.061794 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.066862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.067093 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.067106 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.067123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.067135 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:00Z","lastTransitionTime":"2025-12-04T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.077386 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.092001 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.105627 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.117418 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.128328 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.133762 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.133814 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6dlzn\" (UniqueName: \"kubernetes.io/projected/b4fd6a47-556a-4236-9f60-0e7996e4608a-kube-api-access-6dlzn\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.139186 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.150225 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.170034 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.170088 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.170100 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.170120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.170138 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:00Z","lastTransitionTime":"2025-12-04T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.170474 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.183467 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc 
kubenswrapper[4760]: I1204 12:14:00.210964 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.232118 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.234643 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.234691 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dlzn\" (UniqueName: \"kubernetes.io/projected/b4fd6a47-556a-4236-9f60-0e7996e4608a-kube-api-access-6dlzn\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:00 crc kubenswrapper[4760]: E1204 12:14:00.234798 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:00 crc kubenswrapper[4760]: E1204 12:14:00.234891 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs 
podName:b4fd6a47-556a-4236-9f60-0e7996e4608a nodeName:}" failed. No retries permitted until 2025-12-04 12:14:00.734869673 +0000 UTC m=+43.776316310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs") pod "network-metrics-daemon-xpngr" (UID: "b4fd6a47-556a-4236-9f60-0e7996e4608a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.247950 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318b
deaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.252895 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dlzn\" (UniqueName: \"kubernetes.io/projected/b4fd6a47-556a-4236-9f60-0e7996e4608a-kube-api-access-6dlzn\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.263869 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.264952 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" event={"ID":"345f593d-ac28-4bf4-aed0-adbad7c3a90e","Type":"ContainerStarted","Data":"7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.265027 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" event={"ID":"345f593d-ac28-4bf4-aed0-adbad7c3a90e","Type":"ContainerStarted","Data":"f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.267854 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/0.log" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.271460 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf" exitCode=1 Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.271514 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" 
event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.271955 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.272008 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.272020 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.272040 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.272055 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:00Z","lastTransitionTime":"2025-12-04T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.272458 4760 scope.go:117] "RemoveContainer" containerID="af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.280437 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.297752 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.319927 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.336367 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.350832 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.366402 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.378441 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.378506 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.378528 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.378551 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.378566 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:00Z","lastTransitionTime":"2025-12-04T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.378946 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc 
kubenswrapper[4760]: I1204 12:14:00.403607 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.424981 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12
:13:59Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 12:13:59.554503 6013 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 12:13:59.553685 6013 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 12:13:59.553947 6013 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 12:13:59.555845 6013 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 12:13:59.555912 6013 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 12:13:59.556001 6013 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 12:13:59.556043 6013 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 12:13:59.556109 6013 factory.go:656] Stopping watch factory\\\\nI1204 12:13:59.556146 6013 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:13:59.556201 6013 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 12:13:59.556271 6013 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 12:13:59.556305 6013 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 
12:13:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f6146
5e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.438444 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.455933 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.469143 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.481512 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.481561 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.481573 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.481593 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.481607 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:00Z","lastTransitionTime":"2025-12-04T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.485455 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.500297 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.517788 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.536943 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.552822 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.567090 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.583152 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.585041 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.585094 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.585106 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.585125 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.585136 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:00Z","lastTransitionTime":"2025-12-04T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.600164 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2
b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:00Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.687620 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.687689 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.687701 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.687726 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.687741 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:00Z","lastTransitionTime":"2025-12-04T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.762365 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:00 crc kubenswrapper[4760]: E1204 12:14:00.762615 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:00 crc kubenswrapper[4760]: E1204 12:14:00.762769 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs podName:b4fd6a47-556a-4236-9f60-0e7996e4608a nodeName:}" failed. No retries permitted until 2025-12-04 12:14:01.762736251 +0000 UTC m=+44.804182888 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs") pod "network-metrics-daemon-xpngr" (UID: "b4fd6a47-556a-4236-9f60-0e7996e4608a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.790093 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.790145 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.790156 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.790172 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.790182 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:00Z","lastTransitionTime":"2025-12-04T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.863413 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.863544 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:00 crc kubenswrapper[4760]: E1204 12:14:00.863645 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.863441 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:00 crc kubenswrapper[4760]: E1204 12:14:00.863767 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:00 crc kubenswrapper[4760]: E1204 12:14:00.863884 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.893012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.893098 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.893113 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.893141 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.893157 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:00Z","lastTransitionTime":"2025-12-04T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.995966 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.996021 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.996032 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.996051 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:00 crc kubenswrapper[4760]: I1204 12:14:00.996067 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:00Z","lastTransitionTime":"2025-12-04T12:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.093388 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.093446 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.093458 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.093479 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.093490 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: E1204 12:14:01.108808 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.114394 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.114721 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.114810 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.114888 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.114950 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: E1204 12:14:01.130385 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.135945 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.136162 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.136257 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.136367 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.136442 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: E1204 12:14:01.153338 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.158927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.159021 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.159037 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.159063 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.159078 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: E1204 12:14:01.177671 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.183789 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.183859 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.183871 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.183891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.183902 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: E1204 12:14:01.201228 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: E1204 12:14:01.201349 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.203282 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.203405 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.203482 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.203560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.203627 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.277440 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/0.log" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.280739 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc"} Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.281127 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.299321 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.311767 4760 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.311826 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.311841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.311866 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.311878 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.320946 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.337790 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.354552 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.389112 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.401270 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.415137 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.415241 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.415254 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.415296 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.415314 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.424556 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.438220 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.454920 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.469470 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.485296 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc 
kubenswrapper[4760]: I1204 12:14:01.507110 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:13:59Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 12:13:59.554503 6013 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 12:13:59.553685 6013 
reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 12:13:59.553947 6013 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 12:13:59.555845 6013 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 12:13:59.555912 6013 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 12:13:59.556001 6013 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 12:13:59.556043 6013 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 12:13:59.556109 6013 factory.go:656] Stopping watch factory\\\\nI1204 12:13:59.556146 6013 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:13:59.556201 6013 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 12:13:59.556271 6013 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 12:13:59.556305 6013 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 
12:13:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.517961 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.517995 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.518006 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.518022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.518034 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.524004 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.537747 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.549956 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.566088 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.620645 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc 
kubenswrapper[4760]: I1204 12:14:01.620968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.620994 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.621036 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.621050 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.649131 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.723990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.724043 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.724055 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.724074 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.724085 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.772835 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:01 crc kubenswrapper[4760]: E1204 12:14:01.773036 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:01 crc kubenswrapper[4760]: E1204 12:14:01.773139 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs podName:b4fd6a47-556a-4236-9f60-0e7996e4608a nodeName:}" failed. No retries permitted until 2025-12-04 12:14:03.773118686 +0000 UTC m=+46.814565253 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs") pod "network-metrics-daemon-xpngr" (UID: "b4fd6a47-556a-4236-9f60-0e7996e4608a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.827848 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.828349 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.828465 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.828564 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.828656 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.863479 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:01 crc kubenswrapper[4760]: E1204 12:14:01.863847 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.934132 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.934233 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.934254 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.934283 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:01 crc kubenswrapper[4760]: I1204 12:14:01.934302 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:01Z","lastTransitionTime":"2025-12-04T12:14:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.038000 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.038681 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.038770 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.038854 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.038943 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:02Z","lastTransitionTime":"2025-12-04T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.141815 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.141862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.141873 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.141892 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.141908 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:02Z","lastTransitionTime":"2025-12-04T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.244773 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.244821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.244837 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.244855 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.244865 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:02Z","lastTransitionTime":"2025-12-04T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.290341 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/1.log" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.291735 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/0.log" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.294152 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc" exitCode=1 Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.294197 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.294304 4760 scope.go:117] "RemoveContainer" containerID="af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.295484 4760 scope.go:117] "RemoveContainer" containerID="1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc" Dec 04 12:14:02 crc kubenswrapper[4760]: E1204 12:14:02.295733 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.312242 4760 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.326842 4760 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\
"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.342676 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.351420 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.351460 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.351476 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.351492 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.351502 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:02Z","lastTransitionTime":"2025-12-04T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.356381 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.367968 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.384621 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.397166 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.407549 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.417999 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.431546 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.447839 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.454318 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.454355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.454365 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.454382 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.454393 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:02Z","lastTransitionTime":"2025-12-04T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.464667 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.482616 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.495820 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.519360 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f
0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.531995 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.550977 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:13:59Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 12:13:59.554503 6013 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 12:13:59.553685 6013 
reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 12:13:59.553947 6013 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 12:13:59.555845 6013 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 12:13:59.555912 6013 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 12:13:59.556001 6013 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 12:13:59.556043 6013 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 12:13:59.556109 6013 factory.go:656] Stopping watch factory\\\\nI1204 12:13:59.556146 6013 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:13:59.556201 6013 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 12:13:59.556271 6013 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 12:13:59.556305 6013 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 12:13:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\".893232 6260 obj_retry.go:409] Going to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-xpngr]\\\\nI1204 12:14:01.893197 6260 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 12:14:01.893255 6260 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name pod/openshift-console/downloads-7954f5f757-dn2jn. OVN-Kubernetes controller took 3.2591e-05 seconds. 
No OVN measurement.\\\\nI1204 12:14:01.893243 6260 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1204 12:14:01.893284 6260 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-xpngr before timer (time: 2025-12-04 12:14:03.032873795 +0000 UTC m=+2.499100890): skip\\\\nI1204 12:14:01.893311 6260 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 80.323µs)\\\\nI1204 12:14:01.893326 6260 factory.go:656] Stopping watch factory\\\\nI1204 12:14:01.893348 6260 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:14:01.893395 6260 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 12:14:01.893424 6260 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 12:14:01.893589 6260 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\
\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.557065 
4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.557092 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.557103 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.557119 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.557129 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:02Z","lastTransitionTime":"2025-12-04T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.660225 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.660310 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.660325 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.660380 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.660397 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:02Z","lastTransitionTime":"2025-12-04T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.763693 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.763746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.763760 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.763778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.763791 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:02Z","lastTransitionTime":"2025-12-04T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.863365 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.863469 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:02 crc kubenswrapper[4760]: E1204 12:14:02.863509 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.863599 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:02 crc kubenswrapper[4760]: E1204 12:14:02.863659 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:02 crc kubenswrapper[4760]: E1204 12:14:02.863761 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.866026 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.866087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.866106 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.866128 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.866142 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:02Z","lastTransitionTime":"2025-12-04T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.968532 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.968571 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.968586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.968604 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:02 crc kubenswrapper[4760]: I1204 12:14:02.968616 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:02Z","lastTransitionTime":"2025-12-04T12:14:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.071400 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.071450 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.071460 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.071478 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.071488 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:03Z","lastTransitionTime":"2025-12-04T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.174928 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.174968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.174979 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.174996 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.175007 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:03Z","lastTransitionTime":"2025-12-04T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.278472 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.278520 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.278535 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.278552 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.278561 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:03Z","lastTransitionTime":"2025-12-04T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.299737 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/1.log" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.380556 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.380595 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.380606 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.380623 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.380637 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:03Z","lastTransitionTime":"2025-12-04T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.483323 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.483377 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.483390 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.483412 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.483429 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:03Z","lastTransitionTime":"2025-12-04T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.586392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.586439 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.586449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.586471 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.586482 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:03Z","lastTransitionTime":"2025-12-04T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.689266 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.689322 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.689335 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.689355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.689368 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:03Z","lastTransitionTime":"2025-12-04T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.793173 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.793265 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.793288 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.793308 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.793318 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:03Z","lastTransitionTime":"2025-12-04T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.796102 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:03 crc kubenswrapper[4760]: E1204 12:14:03.796331 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:03 crc kubenswrapper[4760]: E1204 12:14:03.796453 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs podName:b4fd6a47-556a-4236-9f60-0e7996e4608a nodeName:}" failed. No retries permitted until 2025-12-04 12:14:07.796420271 +0000 UTC m=+50.837867028 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs") pod "network-metrics-daemon-xpngr" (UID: "b4fd6a47-556a-4236-9f60-0e7996e4608a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.863870 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:03 crc kubenswrapper[4760]: E1204 12:14:03.864085 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.897508 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.897571 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.897583 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.897602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:03 crc kubenswrapper[4760]: I1204 12:14:03.897620 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:03Z","lastTransitionTime":"2025-12-04T12:14:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.000256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.000333 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.000343 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.000358 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.000368 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:04Z","lastTransitionTime":"2025-12-04T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.103425 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.103484 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.103497 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.103518 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.103531 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:04Z","lastTransitionTime":"2025-12-04T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.206371 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.206443 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.206456 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.206479 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.206493 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:04Z","lastTransitionTime":"2025-12-04T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.311955 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.312538 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.312596 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.312966 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.312980 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:04Z","lastTransitionTime":"2025-12-04T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.417106 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.417161 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.417174 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.417196 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.417231 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:04Z","lastTransitionTime":"2025-12-04T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.520897 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.521247 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.521352 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.521455 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.521557 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:04Z","lastTransitionTime":"2025-12-04T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.625171 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.625275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.625292 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.625318 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.625338 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:04Z","lastTransitionTime":"2025-12-04T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.728315 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.728370 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.728381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.728400 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.728413 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:04Z","lastTransitionTime":"2025-12-04T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.832542 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.832586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.832596 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.832616 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.832629 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:04Z","lastTransitionTime":"2025-12-04T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.863537 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.863601 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.863704 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:04 crc kubenswrapper[4760]: E1204 12:14:04.863890 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:04 crc kubenswrapper[4760]: E1204 12:14:04.864099 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:04 crc kubenswrapper[4760]: E1204 12:14:04.864166 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.935945 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.935994 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.936007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.936027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:04 crc kubenswrapper[4760]: I1204 12:14:04.936043 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:04Z","lastTransitionTime":"2025-12-04T12:14:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.039016 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.039101 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.039120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.039147 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.039165 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:05Z","lastTransitionTime":"2025-12-04T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.142150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.142224 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.142239 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.142257 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.142276 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:05Z","lastTransitionTime":"2025-12-04T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.208762 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.219103 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.222463 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.240396 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.244939 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.244981 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.244991 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.245005 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.245015 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:05Z","lastTransitionTime":"2025-12-04T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.258003 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T
12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.271800 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.285881 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.299335 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.321205 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.336495 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.348257 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.348292 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.348306 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.348322 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.348332 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:05Z","lastTransitionTime":"2025-12-04T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.351953 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.370153 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.383429 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc 
kubenswrapper[4760]: I1204 12:14:05.405342 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:13:59Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 12:13:59.554503 6013 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 12:13:59.553685 6013 
reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 12:13:59.553947 6013 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 12:13:59.555845 6013 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 12:13:59.555912 6013 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 12:13:59.556001 6013 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 12:13:59.556043 6013 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 12:13:59.556109 6013 factory.go:656] Stopping watch factory\\\\nI1204 12:13:59.556146 6013 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:13:59.556201 6013 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 12:13:59.556271 6013 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 12:13:59.556305 6013 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 12:13:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\".893232 6260 obj_retry.go:409] Going to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-xpngr]\\\\nI1204 12:14:01.893197 6260 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 12:14:01.893255 6260 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name pod/openshift-console/downloads-7954f5f757-dn2jn. OVN-Kubernetes controller took 3.2591e-05 seconds. 
No OVN measurement.\\\\nI1204 12:14:01.893243 6260 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1204 12:14:01.893284 6260 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-xpngr before timer (time: 2025-12-04 12:14:03.032873795 +0000 UTC m=+2.499100890): skip\\\\nI1204 12:14:01.893311 6260 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 80.323µs)\\\\nI1204 12:14:01.893326 6260 factory.go:656] Stopping watch factory\\\\nI1204 12:14:01.893348 6260 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:14:01.893395 6260 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 12:14:01.893424 6260 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 12:14:01.893589 6260 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\
\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.420479 
4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814
a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.434844 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.446897 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.455040 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.455195 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.455621 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:05 crc 
kubenswrapper[4760]: I1204 12:14:05.455652 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.455668 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:05Z","lastTransitionTime":"2025-12-04T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.460765 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.474904 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c
9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:05Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.558555 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.558592 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.558602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.558619 4760 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.558630 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:05Z","lastTransitionTime":"2025-12-04T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.661657 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.661720 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.661732 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.661750 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.661760 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:05Z","lastTransitionTime":"2025-12-04T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.766378 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.766437 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.766453 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.766476 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.766492 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:05Z","lastTransitionTime":"2025-12-04T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.863298 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:05 crc kubenswrapper[4760]: E1204 12:14:05.863456 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.870052 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.870095 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.870106 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.870123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.870139 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:05Z","lastTransitionTime":"2025-12-04T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.973589 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.973645 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.973658 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.973679 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:05 crc kubenswrapper[4760]: I1204 12:14:05.973691 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:05Z","lastTransitionTime":"2025-12-04T12:14:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.076600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.076912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.076988 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.077087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.077172 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:06Z","lastTransitionTime":"2025-12-04T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.179704 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.179954 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.180028 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.180122 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.180271 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:06Z","lastTransitionTime":"2025-12-04T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.282731 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.282773 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.282789 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.282838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.282849 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:06Z","lastTransitionTime":"2025-12-04T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.384916 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.384957 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.384972 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.384988 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.384998 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:06Z","lastTransitionTime":"2025-12-04T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.488343 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.488392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.488413 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.488435 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.488448 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:06Z","lastTransitionTime":"2025-12-04T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.591409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.591464 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.591475 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.591497 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.591508 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:06Z","lastTransitionTime":"2025-12-04T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.693866 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.693917 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.693925 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.693943 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.693953 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:06Z","lastTransitionTime":"2025-12-04T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.796508 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.796560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.796582 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.796602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.796616 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:06Z","lastTransitionTime":"2025-12-04T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.863304 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:06 crc kubenswrapper[4760]: E1204 12:14:06.863472 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.863542 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:06 crc kubenswrapper[4760]: E1204 12:14:06.863619 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.863683 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:06 crc kubenswrapper[4760]: E1204 12:14:06.863844 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.898774 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.898828 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.898869 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.898887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:06 crc kubenswrapper[4760]: I1204 12:14:06.898903 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:06Z","lastTransitionTime":"2025-12-04T12:14:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.001282 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.001328 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.001340 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.001356 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.001367 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:07Z","lastTransitionTime":"2025-12-04T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.104083 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.104138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.104149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.104165 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.104177 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:07Z","lastTransitionTime":"2025-12-04T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.207384 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.207443 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.207455 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.207473 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.207484 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:07Z","lastTransitionTime":"2025-12-04T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.311018 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.311071 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.311087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.311111 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.311123 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:07Z","lastTransitionTime":"2025-12-04T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.413514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.413570 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.413579 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.413597 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.413614 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:07Z","lastTransitionTime":"2025-12-04T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.515565 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.515670 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.515683 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.515701 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.515716 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:07Z","lastTransitionTime":"2025-12-04T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.618780 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.618858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.618874 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.618934 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.618984 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:07Z","lastTransitionTime":"2025-12-04T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.721933 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.721970 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.721978 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.721992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.722002 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:07Z","lastTransitionTime":"2025-12-04T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.824406 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.824634 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.824650 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.824668 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.824681 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:07Z","lastTransitionTime":"2025-12-04T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.842483 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr"
Dec 04 12:14:07 crc kubenswrapper[4760]: E1204 12:14:07.842622 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 04 12:14:07 crc kubenswrapper[4760]: E1204 12:14:07.842674 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs podName:b4fd6a47-556a-4236-9f60-0e7996e4608a nodeName:}" failed. No retries permitted until 2025-12-04 12:14:15.842660277 +0000 UTC m=+58.884106844 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs") pod "network-metrics-daemon-xpngr" (UID: "b4fd6a47-556a-4236-9f60-0e7996e4608a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.864620 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr"
Dec 04 12:14:07 crc kubenswrapper[4760]: E1204 12:14:07.864787 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.890288 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af12b3bbf9031a1d87a581eede70d77641834a48a29285dc3306c0394d5e7fbf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:13:59Z\\\",\\\"message\\\":\\\"om/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 12:13:59.554503 6013 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 12:13:59.553685 6013 
reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 12:13:59.553947 6013 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 12:13:59.555845 6013 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 12:13:59.555912 6013 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 12:13:59.556001 6013 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 12:13:59.556043 6013 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1204 12:13:59.556109 6013 factory.go:656] Stopping watch factory\\\\nI1204 12:13:59.556146 6013 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:13:59.556201 6013 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 12:13:59.556271 6013 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 12:13:59.556305 6013 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 12:13:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\".893232 6260 obj_retry.go:409] Going to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-xpngr]\\\\nI1204 12:14:01.893197 6260 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 12:14:01.893255 6260 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name pod/openshift-console/downloads-7954f5f757-dn2jn. OVN-Kubernetes controller took 3.2591e-05 seconds. 
No OVN measurement.\\\\nI1204 12:14:01.893243 6260 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1204 12:14:01.893284 6260 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-xpngr before timer (time: 2025-12-04 12:14:03.032873795 +0000 UTC m=+2.499100890): skip\\\\nI1204 12:14:01.893311 6260 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 80.323µs)\\\\nI1204 12:14:01.893326 6260 factory.go:656] Stopping watch factory\\\\nI1204 12:14:01.893348 6260 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:14:01.893395 6260 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 12:14:01.893424 6260 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 12:14:01.893589 6260 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\
\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:07Z is after 2025-08-24T17:21:41Z"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.908358 
4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814
a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:07Z is after 2025-08-24T17:21:41Z"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.925056 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:07Z is after 2025-08-24T17:21:41Z"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.927170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.927251 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.927264 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.927284 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.927296 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:07Z","lastTransitionTime":"2025-12-04T12:14:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.939785 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.959450 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.969948 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.983389 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:07 crc kubenswrapper[4760]: I1204 12:14:07.997962 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:07Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.015721 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:08Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.031673 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.031719 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.031731 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.031749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.031762 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:08Z","lastTransitionTime":"2025-12-04T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.032734 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:08Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.047932 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:08Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.061046 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:08Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.073478 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:08Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.094077 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f
0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:08Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.109912 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:08Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.124794 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:08Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.134246 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.134277 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.134285 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.134299 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.134308 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:08Z","lastTransitionTime":"2025-12-04T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.139962 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:08Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.153494 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:08Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:08 crc 
kubenswrapper[4760]: I1204 12:14:08.236853 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.236899 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.236908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.236921 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.236931 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:08Z","lastTransitionTime":"2025-12-04T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.340279 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.340324 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.340334 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.340348 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.340356 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:08Z","lastTransitionTime":"2025-12-04T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.443180 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.443248 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.443260 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.443279 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.443288 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:08Z","lastTransitionTime":"2025-12-04T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.545591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.545638 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.545649 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.545666 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.545678 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:08Z","lastTransitionTime":"2025-12-04T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.648476 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.648521 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.648529 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.648582 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.648591 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:08Z","lastTransitionTime":"2025-12-04T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.751259 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.751303 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.751314 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.751335 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.751346 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:08Z","lastTransitionTime":"2025-12-04T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.853779 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.853830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.853841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.853855 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.853865 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:08Z","lastTransitionTime":"2025-12-04T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.864567 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:08 crc kubenswrapper[4760]: E1204 12:14:08.864700 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.864567 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.864567 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:08 crc kubenswrapper[4760]: E1204 12:14:08.864880 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:08 crc kubenswrapper[4760]: E1204 12:14:08.864916 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.956562 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.956643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.956653 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.956669 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:08 crc kubenswrapper[4760]: I1204 12:14:08.956678 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:08Z","lastTransitionTime":"2025-12-04T12:14:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.060636 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.060712 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.060729 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.060755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.060776 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:09Z","lastTransitionTime":"2025-12-04T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.163857 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.163913 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.163927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.163947 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.163964 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:09Z","lastTransitionTime":"2025-12-04T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.266736 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.266781 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.266793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.266809 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.266819 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:09Z","lastTransitionTime":"2025-12-04T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.370266 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.370321 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.370332 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.370351 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.370363 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:09Z","lastTransitionTime":"2025-12-04T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.473075 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.473129 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.473142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.473160 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.473172 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:09Z","lastTransitionTime":"2025-12-04T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.587752 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.587791 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.587803 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.587821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.587833 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:09Z","lastTransitionTime":"2025-12-04T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.691276 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.691325 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.691336 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.691353 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.691364 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:09Z","lastTransitionTime":"2025-12-04T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.794012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.794066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.794077 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.794094 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.794105 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:09Z","lastTransitionTime":"2025-12-04T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.864087 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:09 crc kubenswrapper[4760]: E1204 12:14:09.864283 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.897018 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.897069 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.897085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.897102 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.897124 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:09Z","lastTransitionTime":"2025-12-04T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.999293 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.999327 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.999345 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.999364 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:09 crc kubenswrapper[4760]: I1204 12:14:09.999375 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:09Z","lastTransitionTime":"2025-12-04T12:14:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.101422 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.101462 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.101475 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.101490 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.101500 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:10Z","lastTransitionTime":"2025-12-04T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.203512 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.203544 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.203555 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.203571 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.203581 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:10Z","lastTransitionTime":"2025-12-04T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.306692 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.306792 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.306806 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.306826 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.307031 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:10Z","lastTransitionTime":"2025-12-04T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.410486 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.410560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.410574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.410604 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.410615 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:10Z","lastTransitionTime":"2025-12-04T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.513684 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.513714 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.513725 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.513740 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.513750 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:10Z","lastTransitionTime":"2025-12-04T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.616852 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.616920 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.616942 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.616963 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.616974 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:10Z","lastTransitionTime":"2025-12-04T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.719992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.720051 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.720067 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.720091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.720107 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:10Z","lastTransitionTime":"2025-12-04T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.823369 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.823430 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.823443 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.823462 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.823476 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:10Z","lastTransitionTime":"2025-12-04T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.863482 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.863518 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:10 crc kubenswrapper[4760]: E1204 12:14:10.863704 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.863818 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:10 crc kubenswrapper[4760]: E1204 12:14:10.863979 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:10 crc kubenswrapper[4760]: E1204 12:14:10.864135 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.925692 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.925766 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.925778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.925824 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:10 crc kubenswrapper[4760]: I1204 12:14:10.925838 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:10Z","lastTransitionTime":"2025-12-04T12:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.029838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.029871 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.029880 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.029895 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.029905 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.133048 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.133084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.133095 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.133111 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.133123 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.235468 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.235496 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.235504 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.235518 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.235528 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.337496 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.337528 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.337540 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.337556 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.337567 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.440112 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.440162 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.440173 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.440190 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.440202 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.542338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.542367 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.542376 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.542389 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.542398 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.579169 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.579259 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.579268 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.579283 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.579293 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: E1204 12:14:11.592300 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.596267 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.596313 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.596325 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.596343 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.596354 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: E1204 12:14:11.609370 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.613701 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.613755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.613766 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.613785 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.613797 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: E1204 12:14:11.660993 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:11Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:11 crc kubenswrapper[4760]: E1204 12:14:11.661135 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.662936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.662971 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.662980 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.662996 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.663006 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.766119 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.766164 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.766176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.766193 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.766221 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.863690 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:11 crc kubenswrapper[4760]: E1204 12:14:11.863969 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.867954 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.867995 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.868004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.868021 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.868031 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.970909 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.970949 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.970958 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.970973 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:11 crc kubenswrapper[4760]: I1204 12:14:11.970983 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:11Z","lastTransitionTime":"2025-12-04T12:14:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.073523 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.073919 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.074087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.074192 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.074309 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:12Z","lastTransitionTime":"2025-12-04T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.176710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.176761 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.176775 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.176793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.176805 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:12Z","lastTransitionTime":"2025-12-04T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.278840 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.278919 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.278931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.278953 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.278966 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:12Z","lastTransitionTime":"2025-12-04T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.382183 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.382256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.382266 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.382285 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.382297 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:12Z","lastTransitionTime":"2025-12-04T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.484477 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.484526 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.484552 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.484569 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.484579 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:12Z","lastTransitionTime":"2025-12-04T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.586702 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.586737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.586745 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.586759 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.586768 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:12Z","lastTransitionTime":"2025-12-04T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.689858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.689903 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.689915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.689932 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.689944 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:12Z","lastTransitionTime":"2025-12-04T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.792955 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.793006 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.793019 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.793038 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.793049 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:12Z","lastTransitionTime":"2025-12-04T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.864193 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.864237 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.864360 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:12 crc kubenswrapper[4760]: E1204 12:14:12.864514 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:12 crc kubenswrapper[4760]: E1204 12:14:12.864670 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:12 crc kubenswrapper[4760]: E1204 12:14:12.864758 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.895026 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.895083 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.895094 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.895110 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.895121 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:12Z","lastTransitionTime":"2025-12-04T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.998070 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.998119 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.998129 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.998142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:12 crc kubenswrapper[4760]: I1204 12:14:12.998151 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:12Z","lastTransitionTime":"2025-12-04T12:14:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.101191 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.101314 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.101331 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.101355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.101370 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:13Z","lastTransitionTime":"2025-12-04T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.204451 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.204501 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.204510 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.204533 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.204544 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:13Z","lastTransitionTime":"2025-12-04T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.307532 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.307614 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.307630 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.307653 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.307743 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:13Z","lastTransitionTime":"2025-12-04T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.331978 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.332832 4760 scope.go:117] "RemoveContainer" containerID="1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.348859 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2
025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.365022 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.382980 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.399334 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.410977 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.411023 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.411035 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:13 crc 
kubenswrapper[4760]: I1204 12:14:13.411052 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.411062 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:13Z","lastTransitionTime":"2025-12-04T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.413871 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.429707 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.446741 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.460851 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.473974 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.490103 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.507038 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.515535 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.515588 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.515600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.515619 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.515632 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:13Z","lastTransitionTime":"2025-12-04T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.524854 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.541251 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.561178 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.575902 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc 
kubenswrapper[4760]: I1204 12:14:13.598316 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.613543 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.619843 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.619901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.619915 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.619949 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.619962 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:13Z","lastTransitionTime":"2025-12-04T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.635378 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\".893232 6260 obj_retry.go:409] Going to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-xpngr]\\\\nI1204 12:14:01.893197 6260 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 12:14:01.893255 6260 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name 
pod/openshift-console/downloads-7954f5f757-dn2jn. OVN-Kubernetes controller took 3.2591e-05 seconds. No OVN measurement.\\\\nI1204 12:14:01.893243 6260 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1204 12:14:01.893284 6260 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-xpngr before timer (time: 2025-12-04 12:14:03.032873795 +0000 UTC m=+2.499100890): skip\\\\nI1204 12:14:01.893311 6260 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 80.323µs)\\\\nI1204 12:14:01.893326 6260 factory.go:656] Stopping watch factory\\\\nI1204 12:14:01.893348 6260 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:14:01.893395 6260 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 12:14:01.893424 6260 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 12:14:01.893589 6260 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d
19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:13Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.722875 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.722909 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.722918 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.722934 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.722944 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:13Z","lastTransitionTime":"2025-12-04T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.824776 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.824812 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.824824 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.824840 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.824850 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:13Z","lastTransitionTime":"2025-12-04T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.864748 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:13 crc kubenswrapper[4760]: E1204 12:14:13.864937 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.942591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.942664 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.942678 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.942697 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:13 crc kubenswrapper[4760]: I1204 12:14:13.942709 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:13Z","lastTransitionTime":"2025-12-04T12:14:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.067520 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.067550 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.067559 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.067573 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.067582 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:14Z","lastTransitionTime":"2025-12-04T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.170294 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.170373 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.170385 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.170405 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.170417 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:14Z","lastTransitionTime":"2025-12-04T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.272763 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.272823 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.272833 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.272851 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.272865 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:14Z","lastTransitionTime":"2025-12-04T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.354301 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/1.log" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.357485 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.358038 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.371277 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd
72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.375259 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.375307 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.375323 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 
12:14:14.375342 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.375355 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:14Z","lastTransitionTime":"2025-12-04T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.383760 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f
4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.397917 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.410814 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.425082 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.437376 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.450654 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.461446 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc 
kubenswrapper[4760]: I1204 12:14:14.478294 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.478328 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.478339 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.478353 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.478364 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:14Z","lastTransitionTime":"2025-12-04T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.480389 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.492974 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.504398 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.521195 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.542405 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\".893232 6260 obj_retry.go:409] Going to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-xpngr]\\\\nI1204 12:14:01.893197 6260 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 12:14:01.893255 6260 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name 
pod/openshift-console/downloads-7954f5f757-dn2jn. OVN-Kubernetes controller took 3.2591e-05 seconds. No OVN measurement.\\\\nI1204 12:14:01.893243 6260 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1204 12:14:01.893284 6260 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-xpngr before timer (time: 2025-12-04 12:14:03.032873795 +0000 UTC m=+2.499100890): skip\\\\nI1204 12:14:01.893311 6260 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 80.323µs)\\\\nI1204 12:14:01.893326 6260 factory.go:656] Stopping watch factory\\\\nI1204 12:14:01.893348 6260 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:14:01.893395 6260 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 12:14:01.893424 6260 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 12:14:01.893589 6260 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.558137 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.571662 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.581202 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.581273 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.581282 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:14 crc 
kubenswrapper[4760]: I1204 12:14:14.581299 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.581308 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:14Z","lastTransitionTime":"2025-12-04T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.584708 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.598471 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T1
2:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.612890 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:14Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.683891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.683934 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.683943 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.683959 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.683968 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:14Z","lastTransitionTime":"2025-12-04T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.785712 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.785771 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.785783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.785799 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.785810 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:14Z","lastTransitionTime":"2025-12-04T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.863916 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.863958 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:14 crc kubenswrapper[4760]: E1204 12:14:14.864055 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.864136 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:14 crc kubenswrapper[4760]: E1204 12:14:14.864314 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:14 crc kubenswrapper[4760]: E1204 12:14:14.864354 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.888968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.889019 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.889029 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.889053 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.889065 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:14Z","lastTransitionTime":"2025-12-04T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.991775 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.991822 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.991833 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.991850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:14 crc kubenswrapper[4760]: I1204 12:14:14.991861 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:14Z","lastTransitionTime":"2025-12-04T12:14:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.047499 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.047610 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.047659 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:14:47.047627142 +0000 UTC m=+90.089073699 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.047691 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.047754 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.047899 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:14:47.047885191 +0000 UTC m=+90.089331858 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.047958 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.048038 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:14:47.048019685 +0000 UTC m=+90.089466282 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.093840 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.093882 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.093891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.093908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 
04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.093920 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:15Z","lastTransitionTime":"2025-12-04T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.148927 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.148973 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.149094 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.149110 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.149121 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.149130 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.149169 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 12:14:47.149155599 +0000 UTC m=+90.190602166 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.149177 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.149194 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.149276 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 12:14:47.149255553 +0000 UTC m=+90.190702180 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.195910 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.195964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.195982 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.196003 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.196013 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:15Z","lastTransitionTime":"2025-12-04T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.299788 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.299855 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.299870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.299894 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.299910 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:15Z","lastTransitionTime":"2025-12-04T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.364647 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/2.log" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.368588 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/1.log" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.378254 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f" exitCode=1 Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.378420 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f"} Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.378486 4760 scope.go:117] "RemoveContainer" containerID="1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.379147 4760 scope.go:117] "RemoveContainer" containerID="ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f" Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.379346 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.402557 4760 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.402594 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.402603 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.402620 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.402630 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:15Z","lastTransitionTime":"2025-12-04T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.402540 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\".893232 6260 obj_retry.go:409] Going to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-xpngr]\\\\nI1204 12:14:01.893197 6260 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 12:14:01.893255 6260 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name 
pod/openshift-console/downloads-7954f5f757-dn2jn. OVN-Kubernetes controller took 3.2591e-05 seconds. No OVN measurement.\\\\nI1204 12:14:01.893243 6260 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1204 12:14:01.893284 6260 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-xpngr before timer (time: 2025-12-04 12:14:03.032873795 +0000 UTC m=+2.499100890): skip\\\\nI1204 12:14:01.893311 6260 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 80.323µs)\\\\nI1204 12:14:01.893326 6260 factory.go:656] Stopping watch factory\\\\nI1204 12:14:01.893348 6260 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:14:01.893395 6260 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 12:14:01.893424 6260 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 12:14:01.893589 6260 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465
e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.416917 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.429657 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.444609 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.460161 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.473537 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.489931 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.504231 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.505562 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.505600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.505607 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.505621 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.505630 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:15Z","lastTransitionTime":"2025-12-04T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.516796 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.528663 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d
15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.542453 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.557038 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.569553 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.582488 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.599152 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.600353 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.608087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.608139 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.608150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.608342 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.608360 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:15Z","lastTransitionTime":"2025-12-04T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.612801 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc 
kubenswrapper[4760]: I1204 12:14:15.635379 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.649433 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.662568 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.673609 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.686972 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"n
ame\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.701825 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.710231 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.710272 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.710283 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.710303 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.710315 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:15Z","lastTransitionTime":"2025-12-04T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.716248 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.728625 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.741318 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.753613 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.767302 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.782044 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.794933 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.809435 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.813043 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.813073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.813083 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.813096 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.813105 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:15Z","lastTransitionTime":"2025-12-04T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.822992 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.838613 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.850313 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.857100 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.857277 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.857372 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs podName:b4fd6a47-556a-4236-9f60-0e7996e4608a nodeName:}" failed. No retries permitted until 2025-12-04 12:14:31.857352791 +0000 UTC m=+74.898799418 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs") pod "network-metrics-daemon-xpngr" (UID: "b4fd6a47-556a-4236-9f60-0e7996e4608a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.863757 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:15 crc kubenswrapper[4760]: E1204 12:14:15.863910 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.870233 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T
12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"
192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.882326 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.906403 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1137e88bcb92988dee68d9d3c1a59e685088725e531bdd6f1411cf1b959585fc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:01Z\\\",\\\"message\\\":\\\".893232 6260 obj_retry.go:409] Going to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-xpngr]\\\\nI1204 12:14:01.893197 6260 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 12:14:01.893255 6260 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name 
pod/openshift-console/downloads-7954f5f757-dn2jn. OVN-Kubernetes controller took 3.2591e-05 seconds. No OVN measurement.\\\\nI1204 12:14:01.893243 6260 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1204 12:14:01.893284 6260 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-xpngr before timer (time: 2025-12-04 12:14:03.032873795 +0000 UTC m=+2.499100890): skip\\\\nI1204 12:14:01.893311 6260 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 80.323µs)\\\\nI1204 12:14:01.893326 6260 factory.go:656] Stopping watch factory\\\\nI1204 12:14:01.893348 6260 ovnkube.go:599] Stopped ovnkube\\\\nI1204 12:14:01.893395 6260 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 12:14:01.893424 6260 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1204 12:14:01.893589 6260 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465
e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:15Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.916074 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.916148 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.916159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.916174 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:15 crc kubenswrapper[4760]: I1204 12:14:15.916185 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:15Z","lastTransitionTime":"2025-12-04T12:14:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.018814 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.018857 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.018868 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.018884 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.018895 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:16Z","lastTransitionTime":"2025-12-04T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.121879 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.121917 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.121928 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.121946 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.121959 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:16Z","lastTransitionTime":"2025-12-04T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.224591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.224632 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.224641 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.224658 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.224669 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:16Z","lastTransitionTime":"2025-12-04T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.326668 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.326707 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.326718 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.326737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.326747 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:16Z","lastTransitionTime":"2025-12-04T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.383180 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/2.log" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.386391 4760 scope.go:117] "RemoveContainer" containerID="ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f" Dec 04 12:14:16 crc kubenswrapper[4760]: E1204 12:14:16.386536 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.409859 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d
19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.422980 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c5
46b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.428871 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.428918 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.428953 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.428971 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.428980 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:16Z","lastTransitionTime":"2025-12-04T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.436950 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.451471 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.467295 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.480186 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.495121 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.512591 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d
15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.531195 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.531260 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.531271 4760 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.531288 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.531300 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:16Z","lastTransitionTime":"2025-12-04T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.533698 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.548677 4760 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.564558 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.578441 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.591921 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.605152 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc 
kubenswrapper[4760]: I1204 12:14:16.629662 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.633545 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.633570 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.633578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.633591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.633599 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:16Z","lastTransitionTime":"2025-12-04T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.643408 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.654863 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.669380 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:16Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.736369 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.736409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.736420 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.736437 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.736454 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:16Z","lastTransitionTime":"2025-12-04T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.839129 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.839163 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.839173 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.839186 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.839196 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:16Z","lastTransitionTime":"2025-12-04T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.863752 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:16 crc kubenswrapper[4760]: E1204 12:14:16.863914 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.863962 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.864011 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:16 crc kubenswrapper[4760]: E1204 12:14:16.864188 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:16 crc kubenswrapper[4760]: E1204 12:14:16.864380 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.941923 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.941964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.941975 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.941992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:16 crc kubenswrapper[4760]: I1204 12:14:16.942005 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:16Z","lastTransitionTime":"2025-12-04T12:14:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.044978 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.045031 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.045049 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.045075 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.045087 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:17Z","lastTransitionTime":"2025-12-04T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.147615 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.147654 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.147665 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.147682 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.147696 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:17Z","lastTransitionTime":"2025-12-04T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.258552 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.258594 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.258604 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.258620 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.258630 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:17Z","lastTransitionTime":"2025-12-04T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.361633 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.361703 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.361715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.361733 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.361744 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:17Z","lastTransitionTime":"2025-12-04T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.464452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.464476 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.464484 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.464499 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.464508 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:17Z","lastTransitionTime":"2025-12-04T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.572253 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.572299 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.572311 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.572328 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.572339 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:17Z","lastTransitionTime":"2025-12-04T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.675237 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.675289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.675301 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.675320 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.675331 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:17Z","lastTransitionTime":"2025-12-04T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.777460 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.777503 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.777515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.777529 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.777538 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:17Z","lastTransitionTime":"2025-12-04T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.864030 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:17 crc kubenswrapper[4760]: E1204 12:14:17.864184 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.878050 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:17 crc 
kubenswrapper[4760]: I1204 12:14:17.880503 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.880544 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.880555 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.880572 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.880583 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:17Z","lastTransitionTime":"2025-12-04T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.900966 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.917117 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.933316 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.953556 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.974503 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d
19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.982186 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.982252 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.982268 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.982290 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.982305 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:17Z","lastTransitionTime":"2025-12-04T12:14:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:17 crc kubenswrapper[4760]: I1204 12:14:17.989185 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1
60483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:17Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.002647 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.013987 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.028455 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.038860 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.049973 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.064725 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d
15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.079240 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.088757 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.088804 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.088815 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.088833 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.088845 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:18Z","lastTransitionTime":"2025-12-04T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.093443 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.107950 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.120963 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.131743 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:18Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.192080 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.192130 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.192140 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.192157 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.192167 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:18Z","lastTransitionTime":"2025-12-04T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.294565 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.294616 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.294627 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.294647 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.294659 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:18Z","lastTransitionTime":"2025-12-04T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.397186 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.397522 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.397531 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.397549 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.397562 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:18Z","lastTransitionTime":"2025-12-04T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.500555 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.500608 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.500627 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.500652 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.500666 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:18Z","lastTransitionTime":"2025-12-04T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.603720 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.604793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.604884 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.604982 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.605071 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:18Z","lastTransitionTime":"2025-12-04T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.708076 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.708113 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.708127 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.708145 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.708155 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:18Z","lastTransitionTime":"2025-12-04T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.810552 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.810602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.810617 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.810635 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.810649 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:18Z","lastTransitionTime":"2025-12-04T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.863443 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:18 crc kubenswrapper[4760]: E1204 12:14:18.863611 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.863462 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:18 crc kubenswrapper[4760]: E1204 12:14:18.863766 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.863454 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:18 crc kubenswrapper[4760]: E1204 12:14:18.863835 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.913275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.913335 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.913350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.913368 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:18 crc kubenswrapper[4760]: I1204 12:14:18.913380 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:18Z","lastTransitionTime":"2025-12-04T12:14:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.016186 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.016518 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.016604 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.016681 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.016749 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:19Z","lastTransitionTime":"2025-12-04T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.119449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.119484 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.119492 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.119507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.119517 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:19Z","lastTransitionTime":"2025-12-04T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.222194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.222246 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.222256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.222270 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.222279 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:19Z","lastTransitionTime":"2025-12-04T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.325007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.325066 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.325079 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.325096 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.325107 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:19Z","lastTransitionTime":"2025-12-04T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.428120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.428171 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.428184 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.428227 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.428241 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:19Z","lastTransitionTime":"2025-12-04T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.531010 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.531053 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.531065 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.531081 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.531090 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:19Z","lastTransitionTime":"2025-12-04T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.633796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.633936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.633949 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.633977 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.633991 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:19Z","lastTransitionTime":"2025-12-04T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.737671 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.737734 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.737749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.737766 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.737778 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:19Z","lastTransitionTime":"2025-12-04T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.841341 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.841421 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.841438 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.841466 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.841482 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:19Z","lastTransitionTime":"2025-12-04T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.863901 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:19 crc kubenswrapper[4760]: E1204 12:14:19.864853 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.945152 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.945202 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.945235 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.945256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:19 crc kubenswrapper[4760]: I1204 12:14:19.945269 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:19Z","lastTransitionTime":"2025-12-04T12:14:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.047413 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.047466 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.047481 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.047501 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.047511 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:20Z","lastTransitionTime":"2025-12-04T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.150026 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.150063 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.150072 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.150086 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.150094 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:20Z","lastTransitionTime":"2025-12-04T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.253473 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.253552 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.253568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.253595 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.253613 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:20Z","lastTransitionTime":"2025-12-04T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.356716 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.356772 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.356793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.356830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.356852 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:20Z","lastTransitionTime":"2025-12-04T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.459724 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.459833 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.459849 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.459870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.459883 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:20Z","lastTransitionTime":"2025-12-04T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.561993 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.562040 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.562053 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.562071 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.562083 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:20Z","lastTransitionTime":"2025-12-04T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.664404 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.664439 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.664446 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.664460 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.664471 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:20Z","lastTransitionTime":"2025-12-04T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.766507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.766563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.766572 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.766586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.766595 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:20Z","lastTransitionTime":"2025-12-04T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.863929 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.863974 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.864060 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:20 crc kubenswrapper[4760]: E1204 12:14:20.864115 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:20 crc kubenswrapper[4760]: E1204 12:14:20.864241 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:20 crc kubenswrapper[4760]: E1204 12:14:20.864394 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.868506 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.868550 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.868568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.868590 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.868606 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:20Z","lastTransitionTime":"2025-12-04T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.971490 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.971526 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.971536 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.971552 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:20 crc kubenswrapper[4760]: I1204 12:14:20.971563 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:20Z","lastTransitionTime":"2025-12-04T12:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.074798 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.074855 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.074869 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.074887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.074901 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.177731 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.177764 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.177772 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.177805 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.177814 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.281457 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.281507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.281519 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.281537 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.281547 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.384799 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.384898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.384916 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.384943 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.384962 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.487837 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.487902 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.487911 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.487927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.487938 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.591151 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.591201 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.591229 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.591246 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.591256 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.694540 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.694587 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.694601 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.694624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.694637 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.797700 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.797828 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.798155 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.798644 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.798721 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.863662 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:21 crc kubenswrapper[4760]: E1204 12:14:21.863935 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.865785 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.865827 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.865840 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.865868 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.865880 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: E1204 12:14:21.879941 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.888841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.888873 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.888884 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.888901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.888912 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: E1204 12:14:21.903692 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.909073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.909298 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.909408 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.909664 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.909814 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: E1204 12:14:21.926223 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload identical to the previous attempt; elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.933098 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.933542 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.933605 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.933677 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.933740 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: E1204 12:14:21.948994 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload identical to the previous attempt; elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.954420 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.954599 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.954669 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.954749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.954835 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:21 crc kubenswrapper[4760]: E1204 12:14:21.969717 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:21Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:21 crc kubenswrapper[4760]: E1204 12:14:21.969855 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.972283 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.972322 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.972334 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.972352 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:21 crc kubenswrapper[4760]: I1204 12:14:21.972362 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:21Z","lastTransitionTime":"2025-12-04T12:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.075270 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.075558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.075626 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.075714 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.075781 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:22Z","lastTransitionTime":"2025-12-04T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.179572 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.180020 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.180250 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.180505 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.180699 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:22Z","lastTransitionTime":"2025-12-04T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.284046 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.284085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.284095 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.284112 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.284124 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:22Z","lastTransitionTime":"2025-12-04T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.387889 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.387977 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.387993 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.388170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.388263 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:22Z","lastTransitionTime":"2025-12-04T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.491405 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.491712 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.491798 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.491902 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.492028 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:22Z","lastTransitionTime":"2025-12-04T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.595037 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.595090 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.595100 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.595116 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.595126 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:22Z","lastTransitionTime":"2025-12-04T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.698330 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.699396 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.699514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.699624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.699708 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:22Z","lastTransitionTime":"2025-12-04T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.802070 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.802117 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.802132 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.802152 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.802163 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:22Z","lastTransitionTime":"2025-12-04T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.864096 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.864203 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:22 crc kubenswrapper[4760]: E1204 12:14:22.864259 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.864360 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:22 crc kubenswrapper[4760]: E1204 12:14:22.864544 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:22 crc kubenswrapper[4760]: E1204 12:14:22.864706 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.906885 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.906927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.906940 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.906959 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:22 crc kubenswrapper[4760]: I1204 12:14:22.906975 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:22Z","lastTransitionTime":"2025-12-04T12:14:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.009099 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.009131 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.009142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.009157 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.009167 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:23Z","lastTransitionTime":"2025-12-04T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.112441 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.112476 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.112485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.112500 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.112509 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:23Z","lastTransitionTime":"2025-12-04T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.214720 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.214748 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.214755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.214769 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.214778 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:23Z","lastTransitionTime":"2025-12-04T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.316922 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.316977 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.316996 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.317018 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.317033 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:23Z","lastTransitionTime":"2025-12-04T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.419576 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.419638 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.419660 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.419682 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.419695 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:23Z","lastTransitionTime":"2025-12-04T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.523635 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.523699 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.523710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.523727 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.523738 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:23Z","lastTransitionTime":"2025-12-04T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.626753 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.626915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.626927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.626946 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.626956 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:23Z","lastTransitionTime":"2025-12-04T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.729125 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.729180 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.729190 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.729204 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.729240 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:23Z","lastTransitionTime":"2025-12-04T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.832276 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.832337 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.832350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.832367 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.832382 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:23Z","lastTransitionTime":"2025-12-04T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.866941 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:23 crc kubenswrapper[4760]: E1204 12:14:23.867151 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.935578 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.935648 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.935659 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.935681 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:23 crc kubenswrapper[4760]: I1204 12:14:23.935696 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:23Z","lastTransitionTime":"2025-12-04T12:14:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.038014 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.038052 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.038060 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.038074 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.038082 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:24Z","lastTransitionTime":"2025-12-04T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.141521 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.141568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.141577 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.141593 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.141602 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:24Z","lastTransitionTime":"2025-12-04T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.248770 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.248825 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.248839 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.248859 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.248872 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:24Z","lastTransitionTime":"2025-12-04T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.351695 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.351734 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.351743 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.351759 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.351769 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:24Z","lastTransitionTime":"2025-12-04T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.454356 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.454422 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.454433 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.454455 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.454474 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:24Z","lastTransitionTime":"2025-12-04T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.556865 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.556912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.556921 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.556936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.556945 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:24Z","lastTransitionTime":"2025-12-04T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.659884 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.659979 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.659992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.660030 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.660046 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:24Z","lastTransitionTime":"2025-12-04T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.762814 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.762870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.762883 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.762901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.762917 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:24Z","lastTransitionTime":"2025-12-04T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.864003 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.864001 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:24 crc kubenswrapper[4760]: E1204 12:14:24.864147 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.864035 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:24 crc kubenswrapper[4760]: E1204 12:14:24.864332 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:24 crc kubenswrapper[4760]: E1204 12:14:24.864454 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.865282 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.865323 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.865338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.865355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.865371 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:24Z","lastTransitionTime":"2025-12-04T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.968398 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.968499 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.968514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.968541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:24 crc kubenswrapper[4760]: I1204 12:14:24.968563 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:24Z","lastTransitionTime":"2025-12-04T12:14:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.070958 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.071013 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.071022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.071035 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.071058 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:25Z","lastTransitionTime":"2025-12-04T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.173464 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.173506 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.173519 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.173537 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.173548 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:25Z","lastTransitionTime":"2025-12-04T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.276586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.276631 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.276643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.276666 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.276685 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:25Z","lastTransitionTime":"2025-12-04T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.380268 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.380305 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.380313 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.380327 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.380337 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:25Z","lastTransitionTime":"2025-12-04T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.483606 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.483654 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.483664 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.483682 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.483698 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:25Z","lastTransitionTime":"2025-12-04T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.585888 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.586532 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.586650 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.586728 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.586809 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:25Z","lastTransitionTime":"2025-12-04T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.689160 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.689223 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.689234 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.689249 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.689259 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:25Z","lastTransitionTime":"2025-12-04T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.792127 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.792185 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.792196 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.792227 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.792238 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:25Z","lastTransitionTime":"2025-12-04T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.863483 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:25 crc kubenswrapper[4760]: E1204 12:14:25.863616 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.894662 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.894714 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.894726 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.894746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.894761 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:25Z","lastTransitionTime":"2025-12-04T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.996644 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.996707 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.996719 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.996737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:25 crc kubenswrapper[4760]: I1204 12:14:25.996748 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:25Z","lastTransitionTime":"2025-12-04T12:14:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.100011 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.100058 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.100069 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.100090 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.100101 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:26Z","lastTransitionTime":"2025-12-04T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.202917 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.202972 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.202984 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.203004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.203025 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:26Z","lastTransitionTime":"2025-12-04T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.307085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.307379 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.307396 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.307413 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.307423 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:26Z","lastTransitionTime":"2025-12-04T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.409688 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.410297 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.410310 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.410327 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.410337 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:26Z","lastTransitionTime":"2025-12-04T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.513492 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.513551 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.513566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.514417 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.514464 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:26Z","lastTransitionTime":"2025-12-04T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.616827 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.616864 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.616872 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.616887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.616896 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:26Z","lastTransitionTime":"2025-12-04T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.719288 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.719341 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.719352 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.719374 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.719387 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:26Z","lastTransitionTime":"2025-12-04T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.822269 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.822304 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.822315 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.822331 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.822341 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:26Z","lastTransitionTime":"2025-12-04T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.863551 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.863634 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:26 crc kubenswrapper[4760]: E1204 12:14:26.863663 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.863778 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:26 crc kubenswrapper[4760]: E1204 12:14:26.863805 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:26 crc kubenswrapper[4760]: E1204 12:14:26.863935 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.925071 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.925113 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.925122 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.925137 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:26 crc kubenswrapper[4760]: I1204 12:14:26.925146 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:26Z","lastTransitionTime":"2025-12-04T12:14:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.027981 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.028026 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.028035 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.028052 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.028067 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:27Z","lastTransitionTime":"2025-12-04T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.130907 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.130937 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.130951 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.130976 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.130986 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:27Z","lastTransitionTime":"2025-12-04T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.233490 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.233533 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.233542 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.233559 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.233571 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:27Z","lastTransitionTime":"2025-12-04T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.336038 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.336078 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.336086 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.336100 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.336109 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:27Z","lastTransitionTime":"2025-12-04T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.438769 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.438835 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.438845 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.438860 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.438869 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:27Z","lastTransitionTime":"2025-12-04T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.542318 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.542345 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.542354 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.542367 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.542374 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:27Z","lastTransitionTime":"2025-12-04T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.644714 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.644762 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.644772 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.644788 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.644800 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:27Z","lastTransitionTime":"2025-12-04T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.747912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.747971 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.747985 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.748008 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.748022 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:27Z","lastTransitionTime":"2025-12-04T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.851231 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.851278 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.851288 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.851305 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.851315 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:27Z","lastTransitionTime":"2025-12-04T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.863590 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:27 crc kubenswrapper[4760]: E1204 12:14:27.863922 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.878663 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:27Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.902577 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:27Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.937649 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:27Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.953795 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.953850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.953860 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.953876 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.953884 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:27Z","lastTransitionTime":"2025-12-04T12:14:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.964571 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:27Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.983753 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:27Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:27 crc kubenswrapper[4760]: I1204 12:14:27.999184 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:27Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.017432 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.035335 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.050009 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.056538 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.056586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.056597 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.056614 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.056626 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:28Z","lastTransitionTime":"2025-12-04T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.064475 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.078502 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.090555 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.106116 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.119747 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.133987 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.145464 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc 
kubenswrapper[4760]: I1204 12:14:28.158901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.158989 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.159003 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.159022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.159050 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:28Z","lastTransitionTime":"2025-12-04T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.167473 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.189874 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d
19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:28Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.261446 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.261487 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.261500 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.261538 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.261549 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:28Z","lastTransitionTime":"2025-12-04T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.364686 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.364737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.364751 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.364768 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.364782 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:28Z","lastTransitionTime":"2025-12-04T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.468131 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.468167 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.468178 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.468195 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.468223 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:28Z","lastTransitionTime":"2025-12-04T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.570566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.570656 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.570676 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.570701 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.570731 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:28Z","lastTransitionTime":"2025-12-04T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.673955 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.674007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.674017 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.674037 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.674048 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:28Z","lastTransitionTime":"2025-12-04T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.777150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.777229 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.777247 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.777266 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.777278 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:28Z","lastTransitionTime":"2025-12-04T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.863730 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.863770 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.863829 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:28 crc kubenswrapper[4760]: E1204 12:14:28.863911 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:28 crc kubenswrapper[4760]: E1204 12:14:28.864000 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:28 crc kubenswrapper[4760]: E1204 12:14:28.864070 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.879994 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.880086 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.880105 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.880123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.880135 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:28Z","lastTransitionTime":"2025-12-04T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.982839 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.982891 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.982902 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.982920 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:28 crc kubenswrapper[4760]: I1204 12:14:28.982935 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:28Z","lastTransitionTime":"2025-12-04T12:14:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.085924 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.085963 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.085973 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.085994 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.086013 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:29Z","lastTransitionTime":"2025-12-04T12:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.189073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.189108 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.189117 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.189131 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.189140 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:29Z","lastTransitionTime":"2025-12-04T12:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.292490 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.292545 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.292574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.292609 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.292618 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:29Z","lastTransitionTime":"2025-12-04T12:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.396381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.396429 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.396441 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.396459 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.396471 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:29Z","lastTransitionTime":"2025-12-04T12:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.499198 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.499252 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.499262 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.499282 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.499292 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:29Z","lastTransitionTime":"2025-12-04T12:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.601976 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.602028 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.602039 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.602057 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.602066 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:29Z","lastTransitionTime":"2025-12-04T12:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.705108 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.705180 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.705192 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.705231 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.705247 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:29Z","lastTransitionTime":"2025-12-04T12:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.808887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.808957 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.808969 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.808991 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.809007 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:29Z","lastTransitionTime":"2025-12-04T12:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.864620 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:29 crc kubenswrapper[4760]: E1204 12:14:29.864752 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.881592 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.912247 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.912307 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.912318 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.912344 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:29 crc kubenswrapper[4760]: I1204 12:14:29.912357 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:29Z","lastTransitionTime":"2025-12-04T12:14:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.014881 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.014921 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.014933 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.014949 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.014960 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:30Z","lastTransitionTime":"2025-12-04T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.118371 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.118408 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.118419 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.118436 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.118446 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:30Z","lastTransitionTime":"2025-12-04T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.221442 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.221495 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.221506 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.221525 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.221535 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:30Z","lastTransitionTime":"2025-12-04T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.323963 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.324008 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.324016 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.324030 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.324044 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:30Z","lastTransitionTime":"2025-12-04T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.426414 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.426453 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.426462 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.426477 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.426486 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:30Z","lastTransitionTime":"2025-12-04T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.535533 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.535637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.535647 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.535663 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.535673 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:30Z","lastTransitionTime":"2025-12-04T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.638558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.638620 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.638631 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.638651 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.638663 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:30Z","lastTransitionTime":"2025-12-04T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.740967 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.741002 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.741010 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.741025 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.741035 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:30Z","lastTransitionTime":"2025-12-04T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.844783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.844866 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.844879 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.844902 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.844916 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:30Z","lastTransitionTime":"2025-12-04T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.864445 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.864540 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.864596 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:30 crc kubenswrapper[4760]: E1204 12:14:30.865166 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:30 crc kubenswrapper[4760]: E1204 12:14:30.865432 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:30 crc kubenswrapper[4760]: E1204 12:14:30.865520 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.865673 4760 scope.go:117] "RemoveContainer" containerID="ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f" Dec 04 12:14:30 crc kubenswrapper[4760]: E1204 12:14:30.865934 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.947443 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.947517 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.947526 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.947541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:30 crc kubenswrapper[4760]: I1204 12:14:30.947553 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:30Z","lastTransitionTime":"2025-12-04T12:14:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.050894 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.050978 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.050990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.051008 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.051021 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:31Z","lastTransitionTime":"2025-12-04T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.153746 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.153810 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.153823 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.153847 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.153865 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:31Z","lastTransitionTime":"2025-12-04T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.257511 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.257543 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.257550 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.257563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.257572 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:31Z","lastTransitionTime":"2025-12-04T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.362782 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.362838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.362849 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.362871 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.362885 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:31Z","lastTransitionTime":"2025-12-04T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.466775 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.466832 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.466845 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.466861 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.466872 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:31Z","lastTransitionTime":"2025-12-04T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.571349 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.571499 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.571513 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.571561 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.571572 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:31Z","lastTransitionTime":"2025-12-04T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.675764 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.675808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.675817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.675853 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.675863 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:31Z","lastTransitionTime":"2025-12-04T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.779248 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.779342 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.779354 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.779375 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.779387 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:31Z","lastTransitionTime":"2025-12-04T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.863867 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:31 crc kubenswrapper[4760]: E1204 12:14:31.864041 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.882363 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.882404 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.882414 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.882432 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.882444 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:31Z","lastTransitionTime":"2025-12-04T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.931064 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:31 crc kubenswrapper[4760]: E1204 12:14:31.931202 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:31 crc kubenswrapper[4760]: E1204 12:14:31.931315 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs podName:b4fd6a47-556a-4236-9f60-0e7996e4608a nodeName:}" failed. No retries permitted until 2025-12-04 12:15:03.931260887 +0000 UTC m=+106.972707454 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs") pod "network-metrics-daemon-xpngr" (UID: "b4fd6a47-556a-4236-9f60-0e7996e4608a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.986324 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.986372 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.986382 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.986403 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:31 crc kubenswrapper[4760]: I1204 12:14:31.986418 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:31Z","lastTransitionTime":"2025-12-04T12:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.088992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.089128 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.089166 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.089196 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.089268 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.125757 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.125789 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.125806 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.125826 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.125851 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: E1204 12:14:32.139514 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:32Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.145184 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.145263 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.145274 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.145292 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.145302 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: E1204 12:14:32.161839 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:32Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.165796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.165824 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.165834 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.165850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.165858 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: E1204 12:14:32.211116 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:32Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.216951 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.217016 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.217027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.217045 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.217097 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: E1204 12:14:32.231493 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:32Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.236275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.236385 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.236405 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.236435 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.236450 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: E1204 12:14:32.250414 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:32Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:32 crc kubenswrapper[4760]: E1204 12:14:32.250600 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.253645 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.253680 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.253692 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.253715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.253731 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.356881 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.356921 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.356940 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.356959 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.356971 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.459931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.459985 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.459998 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.460016 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.460027 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.563529 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.563612 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.563623 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.563684 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.563695 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.667390 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.667445 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.667458 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.667478 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.667491 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.770968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.771026 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.771036 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.771083 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.771095 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.863392 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.863560 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.863653 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:32 crc kubenswrapper[4760]: E1204 12:14:32.863688 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:32 crc kubenswrapper[4760]: E1204 12:14:32.863796 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:32 crc kubenswrapper[4760]: E1204 12:14:32.863881 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.874611 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.874665 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.874679 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.874700 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.874713 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.977489 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.977773 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.977889 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.978012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:32 crc kubenswrapper[4760]: I1204 12:14:32.978089 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:32Z","lastTransitionTime":"2025-12-04T12:14:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.081588 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.081655 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.081668 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.081688 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.081702 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:33Z","lastTransitionTime":"2025-12-04T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.183989 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.184028 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.184040 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.184056 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.184068 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:33Z","lastTransitionTime":"2025-12-04T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.286850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.286890 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.286901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.286915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.286926 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:33Z","lastTransitionTime":"2025-12-04T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.389417 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.389472 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.389489 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.389509 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.389521 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:33Z","lastTransitionTime":"2025-12-04T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.492064 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.492109 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.492120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.492136 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.492147 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:33Z","lastTransitionTime":"2025-12-04T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.594566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.594631 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.594643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.594670 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.594684 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:33Z","lastTransitionTime":"2025-12-04T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.696564 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.696601 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.696610 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.696624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.696634 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:33Z","lastTransitionTime":"2025-12-04T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.798696 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.798757 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.798768 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.798786 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.798798 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:33Z","lastTransitionTime":"2025-12-04T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.863622 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:33 crc kubenswrapper[4760]: E1204 12:14:33.863773 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.901085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.901121 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.901132 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.901147 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:33 crc kubenswrapper[4760]: I1204 12:14:33.901157 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:33Z","lastTransitionTime":"2025-12-04T12:14:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.004154 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.004198 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.004227 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.004264 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.004276 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:34Z","lastTransitionTime":"2025-12-04T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.107275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.107312 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.107322 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.107338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.107349 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:34Z","lastTransitionTime":"2025-12-04T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.210612 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.210665 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.210678 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.210697 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.210707 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:34Z","lastTransitionTime":"2025-12-04T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.312749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.312799 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.312810 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.312830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.312843 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:34Z","lastTransitionTime":"2025-12-04T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.415500 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.415549 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.415558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.415575 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.415585 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:34Z","lastTransitionTime":"2025-12-04T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.518243 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.518297 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.518312 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.518328 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.518337 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:34Z","lastTransitionTime":"2025-12-04T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.621563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.621620 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.621642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.621661 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.621670 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:34Z","lastTransitionTime":"2025-12-04T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.724069 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.724115 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.724126 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.724143 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.724152 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:34Z","lastTransitionTime":"2025-12-04T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.826189 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.826260 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.826271 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.826289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.826299 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:34Z","lastTransitionTime":"2025-12-04T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.863575 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.863580 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.863688 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:34 crc kubenswrapper[4760]: E1204 12:14:34.863812 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:34 crc kubenswrapper[4760]: E1204 12:14:34.863928 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:34 crc kubenswrapper[4760]: E1204 12:14:34.863997 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.928273 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.928334 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.928355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.928375 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:34 crc kubenswrapper[4760]: I1204 12:14:34.928389 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:34Z","lastTransitionTime":"2025-12-04T12:14:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.030760 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.030810 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.030820 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.030838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.030848 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:35Z","lastTransitionTime":"2025-12-04T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.132928 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.132986 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.132998 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.133017 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.133028 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:35Z","lastTransitionTime":"2025-12-04T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.235019 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.235095 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.235111 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.235150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.235166 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:35Z","lastTransitionTime":"2025-12-04T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.337515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.337670 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.337683 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.337705 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.337719 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:35Z","lastTransitionTime":"2025-12-04T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.441593 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.441655 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.441704 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.441723 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.441734 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:35Z","lastTransitionTime":"2025-12-04T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.544012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.544064 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.544078 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.544095 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.544105 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:35Z","lastTransitionTime":"2025-12-04T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.646403 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.646462 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.646470 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.646488 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.646498 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:35Z","lastTransitionTime":"2025-12-04T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.749064 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.749113 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.749124 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.749140 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.749151 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:35Z","lastTransitionTime":"2025-12-04T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.851499 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.851546 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.851554 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.851568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.851577 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:35Z","lastTransitionTime":"2025-12-04T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.863449 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:35 crc kubenswrapper[4760]: E1204 12:14:35.863634 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.954497 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.954574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.954598 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.954629 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:35 crc kubenswrapper[4760]: I1204 12:14:35.954653 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:35Z","lastTransitionTime":"2025-12-04T12:14:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.057750 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.057811 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.057823 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.057841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.057852 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:36Z","lastTransitionTime":"2025-12-04T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.160192 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.160265 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.160276 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.160293 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.160305 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:36Z","lastTransitionTime":"2025-12-04T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.263537 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.263579 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.263597 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.263617 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.263630 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:36Z","lastTransitionTime":"2025-12-04T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.366126 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.366184 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.366194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.366242 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.366258 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:36Z","lastTransitionTime":"2025-12-04T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.453200 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dg5hd_017b9fc1-6db4-4786-81f1-6cb9b09c90a3/kube-multus/0.log" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.453269 4760 generic.go:334] "Generic (PLEG): container finished" podID="017b9fc1-6db4-4786-81f1-6cb9b09c90a3" containerID="1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249" exitCode=1 Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.453299 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dg5hd" event={"ID":"017b9fc1-6db4-4786-81f1-6cb9b09c90a3","Type":"ContainerDied","Data":"1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.453671 4760 scope.go:117] "RemoveContainer" containerID="1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.469121 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.469443 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.469454 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.469471 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.469481 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:36Z","lastTransitionTime":"2025-12-04T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.476298 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d
19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.491076 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c5
46b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.504397 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.518265 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.535567 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:35Z\\\",\\\"message\\\":\\\"2025-12-04T12:13:49+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0\\\\n2025-12-04T12:13:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0 to /host/opt/cni/bin/\\\\n2025-12-04T12:13:50Z [verbose] multus-daemon started\\\\n2025-12-04T12:13:50Z [verbose] Readiness Indicator file check\\\\n2025-12-04T12:14:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.549689 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.561571 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.572722 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.572772 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.572790 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.572813 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.572827 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:36Z","lastTransitionTime":"2025-12-04T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.575309 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.592044 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.608086 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.623301 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.638047 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.655488 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.669874 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc 
kubenswrapper[4760]: I1204 12:14:36.675874 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.675898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.675908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.675924 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.675934 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:36Z","lastTransitionTime":"2025-12-04T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.681114 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"111765a1-1fb7-4181-9d4a-78de25d8922e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c65534c15fddf0804bfa1bd6742373c5f11b4c947f135d29b84eda8e2ff7ace1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.703938 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.719728 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.732196 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.748136 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:36Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.779739 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.779782 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.779791 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.779804 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.779816 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:36Z","lastTransitionTime":"2025-12-04T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.863752 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:36 crc kubenswrapper[4760]: E1204 12:14:36.863926 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.864008 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:36 crc kubenswrapper[4760]: E1204 12:14:36.864068 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.864117 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:36 crc kubenswrapper[4760]: E1204 12:14:36.864188 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.891844 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.891879 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.891887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.891900 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.891910 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:36Z","lastTransitionTime":"2025-12-04T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.994610 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.994653 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.994666 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.994683 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:36 crc kubenswrapper[4760]: I1204 12:14:36.994693 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:36Z","lastTransitionTime":"2025-12-04T12:14:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.097471 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.097536 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.097545 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.097559 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.097567 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:37Z","lastTransitionTime":"2025-12-04T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.204157 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.204190 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.204199 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.204242 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.204256 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:37Z","lastTransitionTime":"2025-12-04T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.307819 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.307863 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.307877 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.307894 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.307905 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:37Z","lastTransitionTime":"2025-12-04T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.411140 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.411197 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.411227 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.411249 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.411267 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:37Z","lastTransitionTime":"2025-12-04T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.460048 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dg5hd_017b9fc1-6db4-4786-81f1-6cb9b09c90a3/kube-multus/0.log" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.460136 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dg5hd" event={"ID":"017b9fc1-6db4-4786-81f1-6cb9b09c90a3","Type":"ContainerStarted","Data":"d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.475390 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.491263 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.505843 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.514241 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.514308 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.514323 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.514352 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.514368 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:37Z","lastTransitionTime":"2025-12-04T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.521908 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.536552 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.551230 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.563318 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.575667 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"111765a1-1fb7-4181-9d4a-78de25d8922e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c65534c15fddf0804bfa1bd6742373c5f11b4c947f135d29b84eda8e2ff7ace1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.601310 4760 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7a
eb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb93
5678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.617835 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.617901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.617916 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.617939 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.617954 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:37Z","lastTransitionTime":"2025-12-04T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.619795 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.714775 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.720534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.720591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.720605 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.720621 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.720630 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:37Z","lastTransitionTime":"2025-12-04T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.739331 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.756578 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc 
kubenswrapper[4760]: I1204 12:14:37.782462 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d
19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.805132 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c5
46b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.823121 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.824067 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.824117 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.824127 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:37 crc 
kubenswrapper[4760]: I1204 12:14:37.824146 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.824158 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:37Z","lastTransitionTime":"2025-12-04T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.844362 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.858494 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c
27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:35Z\\\",\\\"message\\\":\\\"2025-12-04T12:13:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0\\\\n2025-12-04T12:13:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0 to /host/opt/cni/bin/\\\\n2025-12-04T12:13:50Z [verbose] multus-daemon started\\\\n2025-12-04T12:13:50Z [verbose] Readiness Indicator file check\\\\n2025-12-04T12:14:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib
/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.863683 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:37 crc kubenswrapper[4760]: E1204 12:14:37.863898 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.872272 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.883727 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"111765a1-1fb7-4181-9d4a-78de25d8922e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c65534c15fddf0804bfa1bd6742373c5f11b4c947f135d29b84eda8e2ff7ace1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.902905 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.916856 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.927087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.927123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.927134 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.927172 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.927186 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:37Z","lastTransitionTime":"2025-12-04T12:14:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.932106 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.949663 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a67
7577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc kubenswrapper[4760]: I1204 12:14:37.963767 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:37 crc 
kubenswrapper[4760]: I1204 12:14:37.985325 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d
19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:37Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.002962 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c5
46b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.024161 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.029788 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.029824 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.029835 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:38 crc 
kubenswrapper[4760]: I1204 12:14:38.029854 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.029865 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:38Z","lastTransitionTime":"2025-12-04T12:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.040784 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.059663 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c
27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:35Z\\\",\\\"message\\\":\\\"2025-12-04T12:13:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0\\\\n2025-12-04T12:13:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0 to /host/opt/cni/bin/\\\\n2025-12-04T12:13:50Z [verbose] multus-daemon started\\\\n2025-12-04T12:13:50Z [verbose] Readiness Indicator file check\\\\n2025-12-04T12:14:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib
/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.076571 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.092312 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.112926 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:1
3:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b04
7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.127764 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.131908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.131930 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.131938 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.131951 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.131959 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:38Z","lastTransitionTime":"2025-12-04T12:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.144761 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.161254 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.176875 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.190234 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:38Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.233770 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.233817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.233829 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.233846 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.233858 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:38Z","lastTransitionTime":"2025-12-04T12:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.336849 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.336888 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.336897 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.336910 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.336920 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:38Z","lastTransitionTime":"2025-12-04T12:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.439482 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.439530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.439539 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.439559 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.439570 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:38Z","lastTransitionTime":"2025-12-04T12:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.542396 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.542441 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.542452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.542467 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.542479 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:38Z","lastTransitionTime":"2025-12-04T12:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.644662 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.644726 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.644740 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.644757 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.644769 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:38Z","lastTransitionTime":"2025-12-04T12:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.747459 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.747537 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.747546 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.747560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.747569 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:38Z","lastTransitionTime":"2025-12-04T12:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.849817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.849861 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.849872 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.849889 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.849898 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:38Z","lastTransitionTime":"2025-12-04T12:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.863324 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.863364 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.863412 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:38 crc kubenswrapper[4760]: E1204 12:14:38.863442 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:38 crc kubenswrapper[4760]: E1204 12:14:38.863481 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:38 crc kubenswrapper[4760]: E1204 12:14:38.863538 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.952484 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.952524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.952532 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.952546 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:38 crc kubenswrapper[4760]: I1204 12:14:38.952555 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:38Z","lastTransitionTime":"2025-12-04T12:14:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.055108 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.055150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.055159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.055175 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.055185 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:39Z","lastTransitionTime":"2025-12-04T12:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.157335 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.157370 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.157380 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.157394 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.157403 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:39Z","lastTransitionTime":"2025-12-04T12:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.260075 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.260118 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.260127 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.260141 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.260151 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:39Z","lastTransitionTime":"2025-12-04T12:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.362981 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.363043 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.363060 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.363088 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.363104 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:39Z","lastTransitionTime":"2025-12-04T12:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.464998 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.465046 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.465056 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.465076 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.465087 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:39Z","lastTransitionTime":"2025-12-04T12:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.568040 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.568133 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.568157 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.568187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.568245 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:39Z","lastTransitionTime":"2025-12-04T12:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.671201 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.671338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.671358 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.671381 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.671397 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:39Z","lastTransitionTime":"2025-12-04T12:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.773457 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.773514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.773530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.773551 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.773565 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:39Z","lastTransitionTime":"2025-12-04T12:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.863503 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:39 crc kubenswrapper[4760]: E1204 12:14:39.863665 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.875428 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.875479 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.875490 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.875506 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.875517 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:39Z","lastTransitionTime":"2025-12-04T12:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.977830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.977866 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.977875 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.977890 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:39 crc kubenswrapper[4760]: I1204 12:14:39.977900 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:39Z","lastTransitionTime":"2025-12-04T12:14:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.080252 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.080326 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.080339 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.080363 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.080414 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:40Z","lastTransitionTime":"2025-12-04T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.184174 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.184258 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.184271 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.184293 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.184308 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:40Z","lastTransitionTime":"2025-12-04T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.287293 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.287342 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.287354 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.287373 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.287384 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:40Z","lastTransitionTime":"2025-12-04T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.389673 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.389743 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.389754 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.389775 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.389790 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:40Z","lastTransitionTime":"2025-12-04T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.492986 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.493019 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.493027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.493041 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.493052 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:40Z","lastTransitionTime":"2025-12-04T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.596012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.596074 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.596087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.596106 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.596119 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:40Z","lastTransitionTime":"2025-12-04T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.699575 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.699648 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.699661 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.699683 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.699696 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:40Z","lastTransitionTime":"2025-12-04T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.801909 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.801968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.801981 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.802001 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.802013 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:40Z","lastTransitionTime":"2025-12-04T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.863854 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.863946 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:40 crc kubenswrapper[4760]: E1204 12:14:40.864004 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.864017 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:40 crc kubenswrapper[4760]: E1204 12:14:40.864129 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:40 crc kubenswrapper[4760]: E1204 12:14:40.864274 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.905036 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.905087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.905099 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.905118 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:40 crc kubenswrapper[4760]: I1204 12:14:40.905130 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:40Z","lastTransitionTime":"2025-12-04T12:14:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.006968 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.007007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.007016 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.007031 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.007041 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:41Z","lastTransitionTime":"2025-12-04T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.110052 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.110092 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.110100 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.110120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.110135 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:41Z","lastTransitionTime":"2025-12-04T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.212487 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.212562 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.212573 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.212588 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.212598 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:41Z","lastTransitionTime":"2025-12-04T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.314957 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.315035 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.315049 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.315065 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.315076 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:41Z","lastTransitionTime":"2025-12-04T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.417475 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.417543 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.417556 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.417574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.417585 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:41Z","lastTransitionTime":"2025-12-04T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.520732 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.520769 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.520778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.520793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.520803 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:41Z","lastTransitionTime":"2025-12-04T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.624695 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.624802 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.624816 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.624876 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.624896 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:41Z","lastTransitionTime":"2025-12-04T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.730822 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.730941 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.730960 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.730986 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.731005 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:41Z","lastTransitionTime":"2025-12-04T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.834297 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.834339 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.834349 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.834366 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.834378 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:41Z","lastTransitionTime":"2025-12-04T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.864291 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:41 crc kubenswrapper[4760]: E1204 12:14:41.864554 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.937713 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.937771 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.937787 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.937809 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:41 crc kubenswrapper[4760]: I1204 12:14:41.937822 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:41Z","lastTransitionTime":"2025-12-04T12:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.040511 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.040557 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.040569 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.040589 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.040604 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.142738 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.142783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.142792 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.142806 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.142816 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.246751 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.246850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.246866 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.246895 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.246912 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.344048 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.344290 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.344387 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.344415 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.344429 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: E1204 12:14:42.361554 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:42Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.366858 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.366923 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.366936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.366954 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.366967 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: E1204 12:14:42.381010 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:42Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.392795 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.392854 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.392875 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.392901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.392915 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: E1204 12:14:42.407637 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:42Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.413071 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.413150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.413166 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.413196 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.413230 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: E1204 12:14:42.428712 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:42Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.433601 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.433648 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.433661 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.433685 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.433699 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: E1204 12:14:42.447547 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:42Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:42 crc kubenswrapper[4760]: E1204 12:14:42.447676 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.450045 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.450094 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.450106 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.450126 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.450140 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.553992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.554045 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.554056 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.554078 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.554090 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.658364 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.658409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.658421 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.658472 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.658483 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.761037 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.761075 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.761084 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.761100 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.761111 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.863260 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.863283 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.863365 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:42 crc kubenswrapper[4760]: E1204 12:14:42.863402 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:42 crc kubenswrapper[4760]: E1204 12:14:42.863488 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.864102 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.864130 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.864139 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.864153 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: E1204 12:14:42.864083 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.864167 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.966813 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.967643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.967659 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.967677 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:42 crc kubenswrapper[4760]: I1204 12:14:42.967689 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:42Z","lastTransitionTime":"2025-12-04T12:14:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.070494 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.070551 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.070567 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.070586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.070598 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:43Z","lastTransitionTime":"2025-12-04T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.173004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.173048 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.173133 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.173173 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.173186 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:43Z","lastTransitionTime":"2025-12-04T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.276595 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.276643 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.276656 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.276677 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.276689 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:43Z","lastTransitionTime":"2025-12-04T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.378886 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.378922 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.378933 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.378952 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.378965 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:43Z","lastTransitionTime":"2025-12-04T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.480986 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.481022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.481033 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.481049 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.481058 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:43Z","lastTransitionTime":"2025-12-04T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.583842 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.583909 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.583936 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.583962 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.583973 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:43Z","lastTransitionTime":"2025-12-04T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.686845 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.686924 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.686939 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.686955 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.686968 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:43Z","lastTransitionTime":"2025-12-04T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.789841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.789916 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.789929 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.789949 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.789961 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:43Z","lastTransitionTime":"2025-12-04T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.863889 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:43 crc kubenswrapper[4760]: E1204 12:14:43.864063 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.893003 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.893061 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.893074 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.893091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.893102 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:43Z","lastTransitionTime":"2025-12-04T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.995534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.995600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.995610 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.995624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:43 crc kubenswrapper[4760]: I1204 12:14:43.995634 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:43Z","lastTransitionTime":"2025-12-04T12:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.099270 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.099337 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.099350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.099373 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.099387 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:44Z","lastTransitionTime":"2025-12-04T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.202138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.202182 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.202191 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.202220 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.202228 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:44Z","lastTransitionTime":"2025-12-04T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.304861 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.304923 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.304938 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.304955 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.304968 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:44Z","lastTransitionTime":"2025-12-04T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.407414 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.407464 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.407474 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.407493 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.407504 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:44Z","lastTransitionTime":"2025-12-04T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.510369 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.510444 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.510458 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.510485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.510506 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:44Z","lastTransitionTime":"2025-12-04T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.612359 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.612408 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.612420 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.612437 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.612450 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:44Z","lastTransitionTime":"2025-12-04T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.715091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.715131 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.715145 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.715165 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.715180 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:44Z","lastTransitionTime":"2025-12-04T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.817667 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.817720 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.817729 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.817747 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.817759 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:44Z","lastTransitionTime":"2025-12-04T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.863490 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.863556 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.863509 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:44 crc kubenswrapper[4760]: E1204 12:14:44.863638 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:44 crc kubenswrapper[4760]: E1204 12:14:44.863853 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:44 crc kubenswrapper[4760]: E1204 12:14:44.863967 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.920278 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.920332 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.920345 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.920362 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:44 crc kubenswrapper[4760]: I1204 12:14:44.920375 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:44Z","lastTransitionTime":"2025-12-04T12:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.023041 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.023104 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.023115 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.023131 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.023142 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:45Z","lastTransitionTime":"2025-12-04T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.127417 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.127499 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.127520 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.127554 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.127577 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:45Z","lastTransitionTime":"2025-12-04T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.230422 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.230479 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.230489 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.230509 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.230521 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:45Z","lastTransitionTime":"2025-12-04T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.333022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.333091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.333103 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.333127 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.333141 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:45Z","lastTransitionTime":"2025-12-04T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.436056 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.436108 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.436122 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.436140 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.436149 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:45Z","lastTransitionTime":"2025-12-04T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.539428 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.539503 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.539546 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.539568 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.539581 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:45Z","lastTransitionTime":"2025-12-04T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.642299 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.642359 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.642370 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.642389 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.642401 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:45Z","lastTransitionTime":"2025-12-04T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.744729 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.744785 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.744796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.744814 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.744826 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:45Z","lastTransitionTime":"2025-12-04T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.847375 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.847461 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.847515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.847540 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.847555 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:45Z","lastTransitionTime":"2025-12-04T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.863980 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:45 crc kubenswrapper[4760]: E1204 12:14:45.864401 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.864618 4760 scope.go:117] "RemoveContainer" containerID="ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.950146 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.950494 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.950510 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.950564 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:45 crc kubenswrapper[4760]: I1204 12:14:45.950578 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:45Z","lastTransitionTime":"2025-12-04T12:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.053938 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.053997 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.054013 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.054040 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.054055 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:46Z","lastTransitionTime":"2025-12-04T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.157382 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.157548 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.157558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.157575 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.157585 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:46Z","lastTransitionTime":"2025-12-04T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.260798 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.260844 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.260855 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.260876 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.260889 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:46Z","lastTransitionTime":"2025-12-04T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.364118 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.364165 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.364178 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.364195 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.364241 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:46Z","lastTransitionTime":"2025-12-04T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.467811 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.467873 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.467890 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.467915 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.467935 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:46Z","lastTransitionTime":"2025-12-04T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.512406 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/2.log" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.519101 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.519742 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.536872 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 
12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.554797 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.569312 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.571098 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.571148 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.571159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:46 crc 
kubenswrapper[4760]: I1204 12:14:46.571174 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.571185 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:46Z","lastTransitionTime":"2025-12-04T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.588225 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:35Z\\\",\\\"message\\\":\\\"2025-12-04T12:13:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0\\\\n2025-12-04T12:13:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0 to /host/opt/cni/bin/\\\\n2025-12-04T12:13:50Z [verbose] multus-daemon started\\\\n2025-12-04T12:13:50Z [verbose] Readiness Indicator file check\\\\n2025-12-04T12:14:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.602529 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.615732 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d
15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.630935 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.645578 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.661700 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.675012 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.675073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.675089 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.675115 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.675130 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:46Z","lastTransitionTime":"2025-12-04T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.681429 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.697040 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.711428 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.731598 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"111765a1-1fb7-4181-9d4a-78de25d8922e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c65534c15fddf0804bfa1bd6742373c5f11b4c947f135d29b84eda8e2ff7ace1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.758820 4760 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7a
eb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb93
5678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.776615 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.777967 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.778004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.778015 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.778033 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.778044 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:46Z","lastTransitionTime":"2025-12-04T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.793485 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.813885 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a67
7577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.830959 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc 
kubenswrapper[4760]: I1204 12:14:46.853320 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:46Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.863493 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:46 crc kubenswrapper[4760]: E1204 12:14:46.863657 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.863490 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.863870 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:46 crc kubenswrapper[4760]: E1204 12:14:46.863959 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:46 crc kubenswrapper[4760]: E1204 12:14:46.864257 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.880653 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.880706 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.880718 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.880736 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.880747 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:46Z","lastTransitionTime":"2025-12-04T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.982991 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.983024 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.983035 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.983050 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:46 crc kubenswrapper[4760]: I1204 12:14:46.983061 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:46Z","lastTransitionTime":"2025-12-04T12:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.085369 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.085402 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.085411 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.085425 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.085435 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:47Z","lastTransitionTime":"2025-12-04T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.118911 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.119183 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 12:15:51.119145752 +0000 UTC m=+154.160592379 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.119276 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.119378 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.119548 4760 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.119607 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-04 12:15:51.119596654 +0000 UTC m=+154.161043401 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.119643 4760 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.119734 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 12:15:51.119707156 +0000 UTC m=+154.161153923 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.188290 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.188337 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.188350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.188367 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.188379 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:47Z","lastTransitionTime":"2025-12-04T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.220662 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.220812 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.221937 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.221954 4760 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.221967 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.222012 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 12:15:51.22199262 +0000 UTC m=+154.263439187 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.222203 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.222267 4760 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.222280 4760 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.222358 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 12:15:51.222332019 +0000 UTC m=+154.263778646 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.290942 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.290990 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.290999 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.291017 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.291027 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:47Z","lastTransitionTime":"2025-12-04T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.394243 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.394284 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.394293 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.394310 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.394320 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:47Z","lastTransitionTime":"2025-12-04T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.496860 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.496909 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.496917 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.496932 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.496943 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:47Z","lastTransitionTime":"2025-12-04T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.599456 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.599501 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.599509 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.599524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.599538 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:47Z","lastTransitionTime":"2025-12-04T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.702132 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.702175 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.702184 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.702203 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.702230 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:47Z","lastTransitionTime":"2025-12-04T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.805112 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.805170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.805183 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.805203 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.805240 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:47Z","lastTransitionTime":"2025-12-04T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.863681 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:47 crc kubenswrapper[4760]: E1204 12:14:47.863825 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.875955 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"111765a1-1fb7-4181-9d4a-78de25d8922e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c65534c15fddf0804bfa1bd6742373c5f11b4c947f135d29b84eda8e2ff7ace1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.897573 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.910010 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.910077 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.910091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.910109 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.910122 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:47Z","lastTransitionTime":"2025-12-04T12:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.911104 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.924989 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.942544 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.956486 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:47 crc 
kubenswrapper[4760]: I1204 12:14:47.977973 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:47 crc kubenswrapper[4760]: I1204 12:14:47.993460 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c5
46b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:47Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.008062 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.014276 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.014362 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.014418 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:48 crc 
kubenswrapper[4760]: I1204 12:14:48.014445 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.014461 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:48Z","lastTransitionTime":"2025-12-04T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.023244 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.038320 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c
27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:35Z\\\",\\\"message\\\":\\\"2025-12-04T12:13:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0\\\\n2025-12-04T12:13:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0 to /host/opt/cni/bin/\\\\n2025-12-04T12:13:50Z [verbose] multus-daemon started\\\\n2025-12-04T12:13:50Z [verbose] Readiness Indicator file check\\\\n2025-12-04T12:14:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib
/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.051842 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.072852 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.090302 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:1
3:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b04
7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.108361 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.116992 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.117051 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.117064 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.117088 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.117106 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:48Z","lastTransitionTime":"2025-12-04T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.130356 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.147307 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.163191 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.175872 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:48Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.220464 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.220503 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.220515 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.220534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.220547 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:48Z","lastTransitionTime":"2025-12-04T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.323953 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.324018 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.324034 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.324058 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.324074 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:48Z","lastTransitionTime":"2025-12-04T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.427320 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.428044 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.428092 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.428131 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.428150 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:48Z","lastTransitionTime":"2025-12-04T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.530719 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.530783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.530796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.530816 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.530832 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:48Z","lastTransitionTime":"2025-12-04T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.633654 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.633710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.633722 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.633743 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.633756 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:48Z","lastTransitionTime":"2025-12-04T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.736851 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.736898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.736908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.736923 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.736933 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:48Z","lastTransitionTime":"2025-12-04T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.839693 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.839753 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.839767 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.839792 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.839805 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:48Z","lastTransitionTime":"2025-12-04T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.864010 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.864169 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:48 crc kubenswrapper[4760]: E1204 12:14:48.864303 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.864179 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:48 crc kubenswrapper[4760]: E1204 12:14:48.864434 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:48 crc kubenswrapper[4760]: E1204 12:14:48.864511 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.942378 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.942440 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.942451 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.942474 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:48 crc kubenswrapper[4760]: I1204 12:14:48.942489 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:48Z","lastTransitionTime":"2025-12-04T12:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.045460 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.045511 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.045524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.045543 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.045557 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:49Z","lastTransitionTime":"2025-12-04T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.148387 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.148477 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.148493 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.148523 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.148538 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:49Z","lastTransitionTime":"2025-12-04T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.252093 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.252157 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.252168 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.252190 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.252204 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:49Z","lastTransitionTime":"2025-12-04T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.355405 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.355449 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.355457 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.355473 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.355485 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:49Z","lastTransitionTime":"2025-12-04T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.457550 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.457608 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.457619 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.457636 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.457649 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:49Z","lastTransitionTime":"2025-12-04T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.531927 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/3.log" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.533249 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/2.log" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.535768 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0" exitCode=1 Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.535813 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.535862 4760 scope.go:117] "RemoveContainer" containerID="ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.536536 4760 scope.go:117] "RemoveContainer" containerID="6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0" Dec 04 12:14:49 crc kubenswrapper[4760]: E1204 12:14:49.536746 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.551727 4760 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"111765a1-1fb7-4181-9d4a-78de25d8922e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c65534c15fddf0804bfa1bd6742373c5f11b4c947f135d29b84eda8e2ff7ace1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.559833 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.559877 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.559889 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.559908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.559921 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:49Z","lastTransitionTime":"2025-12-04T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.579350 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.594864 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.609597 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.625714 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.637969 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc 
kubenswrapper[4760]: I1204 12:14:49.660174 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:48Z\\\",\\\"message\\\":\\\"e column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 12:14:47.600905 6809 services_controller.go:434] Service openshift-kube-scheduler/scheduler retrieved from lister for 
network=default: \\\\u0026Service{ObjectMeta:{scheduler openshift-kube-scheduler 66d01bf5-1923-42ac-8d2b-24819bd09205 4792 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-scheduler] map[operator.openshift.io/spec-hash:f185087b7610499b49263c17685abe7f251a50c890808284a072687bf6d73275 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10259 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{scheduler: true,},ClusterIP:10.217.4.169,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.169],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPoli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},
{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.662646 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.662712 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.662727 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.662747 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.662785 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:49Z","lastTransitionTime":"2025-12-04T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.680857 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.693985 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.708406 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.721943 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:35Z\\\",\\\"message\\\":\\\"2025-12-04T12:13:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0\\\\n2025-12-04T12:13:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0 to /host/opt/cni/bin/\\\\n2025-12-04T12:13:50Z [verbose] multus-daemon started\\\\n2025-12-04T12:13:50Z [verbose] 
Readiness Indicator file check\\\\n2025-12-04T12:14:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.734624 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e
6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.747636 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.762798 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.765493 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.765528 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.765541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.765558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.765570 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:49Z","lastTransitionTime":"2025-12-04T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.781484 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.797819 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.813128 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.826680 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.859557 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:49Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.865822 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:49 crc kubenswrapper[4760]: E1204 12:14:49.865983 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.876736 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.876781 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.876791 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.876805 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.876813 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:49Z","lastTransitionTime":"2025-12-04T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.979327 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.979371 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.979382 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.979397 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:49 crc kubenswrapper[4760]: I1204 12:14:49.979407 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:49Z","lastTransitionTime":"2025-12-04T12:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.081709 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.082091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.082194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.082319 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.082400 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:50Z","lastTransitionTime":"2025-12-04T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.184692 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.184730 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.184744 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.184763 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.184776 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:50Z","lastTransitionTime":"2025-12-04T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.287872 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.287908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.287917 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.287931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.287941 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:50Z","lastTransitionTime":"2025-12-04T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.390087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.390157 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.390167 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.390184 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.390193 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:50Z","lastTransitionTime":"2025-12-04T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.492756 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.492785 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.492796 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.492811 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.492821 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:50Z","lastTransitionTime":"2025-12-04T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.541665 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/3.log" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.595638 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.595677 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.595688 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.595705 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.595716 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:50Z","lastTransitionTime":"2025-12-04T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.698474 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.698508 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.698516 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.698545 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.698555 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:50Z","lastTransitionTime":"2025-12-04T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.802334 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.802591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.802602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.802618 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.802629 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:50Z","lastTransitionTime":"2025-12-04T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.864183 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.864340 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:50 crc kubenswrapper[4760]: E1204 12:14:50.864490 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.864770 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:50 crc kubenswrapper[4760]: E1204 12:14:50.864866 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:50 crc kubenswrapper[4760]: E1204 12:14:50.865879 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.904624 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.904724 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.904733 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.904748 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:50 crc kubenswrapper[4760]: I1204 12:14:50.904757 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:50Z","lastTransitionTime":"2025-12-04T12:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.006801 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.006850 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.006863 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.006881 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.006892 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:51Z","lastTransitionTime":"2025-12-04T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.109787 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.109865 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.109875 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.109892 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.109904 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:51Z","lastTransitionTime":"2025-12-04T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.213245 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.213286 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.213299 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.213334 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.213344 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:51Z","lastTransitionTime":"2025-12-04T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.315819 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.315876 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.315886 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.315902 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.315912 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:51Z","lastTransitionTime":"2025-12-04T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.420960 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.421004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.421018 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.421036 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.421048 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:51Z","lastTransitionTime":"2025-12-04T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.524103 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.524137 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.524145 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.524159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.524168 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:51Z","lastTransitionTime":"2025-12-04T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.627177 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.627235 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.627244 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.627260 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.627271 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:51Z","lastTransitionTime":"2025-12-04T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.730171 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.730234 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.730246 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.730264 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.730378 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:51Z","lastTransitionTime":"2025-12-04T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.833372 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.833432 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.833444 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.833461 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.833480 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:51Z","lastTransitionTime":"2025-12-04T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.864246 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:51 crc kubenswrapper[4760]: E1204 12:14:51.864427 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.936316 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.936364 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.936374 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.936391 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:51 crc kubenswrapper[4760]: I1204 12:14:51.936401 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:51Z","lastTransitionTime":"2025-12-04T12:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.039378 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.039419 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.039430 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.039447 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.039457 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.142659 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.142705 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.142717 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.142735 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.142748 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.245976 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.246032 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.246042 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.246061 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.246071 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.348841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.348885 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.348895 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.348913 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.348924 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.451639 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.451679 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.451692 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.451708 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.451718 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.492814 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.492870 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.492888 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.492910 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.492921 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: E1204 12:14:52.507039 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.511366 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.511408 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.511416 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.511431 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.511440 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: E1204 12:14:52.526316 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.530658 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.530699 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.530710 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.530726 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.530735 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: E1204 12:14:52.546357 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.553355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.553539 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.553551 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.553576 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.553592 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: E1204 12:14:52.571819 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.577901 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.577942 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.577954 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.577972 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.577985 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: E1204 12:14:52.594430 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:52Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:52 crc kubenswrapper[4760]: E1204 12:14:52.594610 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.596873 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.596903 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.596913 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.596931 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.596944 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.700092 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.700131 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.700140 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.700156 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.700165 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.803783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.803830 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.803841 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.803861 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.803875 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.864269 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:52 crc kubenswrapper[4760]: E1204 12:14:52.864470 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.864745 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:52 crc kubenswrapper[4760]: E1204 12:14:52.864814 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.865159 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:52 crc kubenswrapper[4760]: E1204 12:14:52.865261 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.907649 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.907704 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.907724 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.907747 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:52 crc kubenswrapper[4760]: I1204 12:14:52.907759 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:52Z","lastTransitionTime":"2025-12-04T12:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.011379 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.011451 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.011466 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.011490 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.011507 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:53Z","lastTransitionTime":"2025-12-04T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.113862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.113923 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.113935 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.113957 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.113976 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:53Z","lastTransitionTime":"2025-12-04T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.217025 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.217100 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.217113 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.217138 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.217155 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:53Z","lastTransitionTime":"2025-12-04T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.319827 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.320203 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.320330 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.320432 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.320512 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:53Z","lastTransitionTime":"2025-12-04T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.423073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.423120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.423129 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.423145 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.423156 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:53Z","lastTransitionTime":"2025-12-04T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.525965 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.526534 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.526628 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.526695 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.526758 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:53Z","lastTransitionTime":"2025-12-04T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.629794 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.629863 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.629877 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.629898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.629913 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:53Z","lastTransitionTime":"2025-12-04T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.732542 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.732585 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.732597 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.732614 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.732626 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:53Z","lastTransitionTime":"2025-12-04T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.836398 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.836437 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.836452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.836471 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.836486 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:53Z","lastTransitionTime":"2025-12-04T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.864521 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:53 crc kubenswrapper[4760]: E1204 12:14:53.864983 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.939744 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.939809 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.939820 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.939842 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:53 crc kubenswrapper[4760]: I1204 12:14:53.939853 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:53Z","lastTransitionTime":"2025-12-04T12:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.043133 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.043182 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.043308 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.043333 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.043346 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:54Z","lastTransitionTime":"2025-12-04T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.146509 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.146558 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.146570 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.146587 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.146601 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:54Z","lastTransitionTime":"2025-12-04T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.249361 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.249437 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.249461 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.249489 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.249508 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:54Z","lastTransitionTime":"2025-12-04T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.352615 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.352670 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.352680 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.352696 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.352705 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:54Z","lastTransitionTime":"2025-12-04T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.455452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.455494 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.455502 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.455520 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.455532 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:54Z","lastTransitionTime":"2025-12-04T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.557475 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.557525 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.557536 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.557594 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.557613 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:54Z","lastTransitionTime":"2025-12-04T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.660366 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.660405 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.660415 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.660458 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.660467 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:54Z","lastTransitionTime":"2025-12-04T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.763300 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.763339 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.763350 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.763376 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.763388 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:54Z","lastTransitionTime":"2025-12-04T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.863721 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:54 crc kubenswrapper[4760]: E1204 12:14:54.863893 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.863740 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.863736 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:54 crc kubenswrapper[4760]: E1204 12:14:54.864008 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:54 crc kubenswrapper[4760]: E1204 12:14:54.864072 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.865452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.865483 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.865495 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.865512 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.865524 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:54Z","lastTransitionTime":"2025-12-04T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.968349 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.968410 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.968427 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.968447 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:54 crc kubenswrapper[4760]: I1204 12:14:54.968460 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:54Z","lastTransitionTime":"2025-12-04T12:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.071056 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.071149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.071168 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.071194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.071268 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:55Z","lastTransitionTime":"2025-12-04T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.263742 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.263785 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.263799 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.263818 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.263832 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:55Z","lastTransitionTime":"2025-12-04T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.366013 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.366073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.366085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.366102 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.366113 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:55Z","lastTransitionTime":"2025-12-04T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.468806 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.468862 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.468881 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.468900 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.468912 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:55Z","lastTransitionTime":"2025-12-04T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.571446 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.571484 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.571494 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.571510 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.571519 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:55Z","lastTransitionTime":"2025-12-04T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.675150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.675202 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.675224 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.675248 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.675264 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:55Z","lastTransitionTime":"2025-12-04T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.778913 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.778956 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.778971 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.778989 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.779001 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:55Z","lastTransitionTime":"2025-12-04T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.864258 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:55 crc kubenswrapper[4760]: E1204 12:14:55.864407 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.882528 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.882575 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.882588 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.882605 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.882965 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:55Z","lastTransitionTime":"2025-12-04T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.985331 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.985388 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.985400 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.985428 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:55 crc kubenswrapper[4760]: I1204 12:14:55.985442 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:55Z","lastTransitionTime":"2025-12-04T12:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.088037 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.088078 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.088090 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.088110 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.088121 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:56Z","lastTransitionTime":"2025-12-04T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.190831 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.190871 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.190880 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.190894 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.190902 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:56Z","lastTransitionTime":"2025-12-04T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.293226 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.293274 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.293286 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.293304 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.293317 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:56Z","lastTransitionTime":"2025-12-04T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.396719 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.396774 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.396784 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.396801 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.396812 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:56Z","lastTransitionTime":"2025-12-04T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.499410 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.499477 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.499493 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.499514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.499536 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:56Z","lastTransitionTime":"2025-12-04T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.602541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.602598 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.602614 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.602640 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.602655 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:56Z","lastTransitionTime":"2025-12-04T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.705552 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.705616 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.705631 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.705647 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.705657 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:56Z","lastTransitionTime":"2025-12-04T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.809175 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.809254 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.809266 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.809289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.809301 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:56Z","lastTransitionTime":"2025-12-04T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.864122 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.864174 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.864184 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:56 crc kubenswrapper[4760]: E1204 12:14:56.864295 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:56 crc kubenswrapper[4760]: E1204 12:14:56.864381 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:56 crc kubenswrapper[4760]: E1204 12:14:56.864457 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.911660 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.911699 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.911707 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.911720 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:56 crc kubenswrapper[4760]: I1204 12:14:56.911729 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:56Z","lastTransitionTime":"2025-12-04T12:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.014656 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.014724 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.014737 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.014757 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.014771 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:57Z","lastTransitionTime":"2025-12-04T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.116832 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.116887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.116898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.116913 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.116922 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:57Z","lastTransitionTime":"2025-12-04T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.219170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.219236 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.219248 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.219266 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.219278 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:57Z","lastTransitionTime":"2025-12-04T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.322839 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.322893 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.322909 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.322925 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.322935 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:57Z","lastTransitionTime":"2025-12-04T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.425187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.425341 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.425379 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.425397 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.425408 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:57Z","lastTransitionTime":"2025-12-04T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.527919 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.527967 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.527978 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.527996 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.528006 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:57Z","lastTransitionTime":"2025-12-04T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.630730 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.630776 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.630788 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.630804 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.630816 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:57Z","lastTransitionTime":"2025-12-04T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.732835 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.732887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.732898 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.732916 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.732927 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:57Z","lastTransitionTime":"2025-12-04T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.835925 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.835975 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.835987 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.836003 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.836014 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:57Z","lastTransitionTime":"2025-12-04T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.864022 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:57 crc kubenswrapper[4760]: E1204 12:14:57.864182 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.892028 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba2d614b5d3fe6655c67a9885805459ffa7549d65b466ed5cbbd7123f6cb289f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:14Z\\\",\\\"message\\\":\\\"ovnkube\\\\nI1204 12:14:14.377429 6406 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1204 12:14:14.377434 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI1204 12:14:14.377441 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI1204 12:14:14.377456 6406 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 926.492µs\\\\nI1204 12:14:14.377461 6406 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 3.185777ms\\\\nI1204 12:14:14.377449 6406 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}\\\\nF1204 12:14:14.377510 6406 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:48Z\\\",\\\"message\\\":\\\"e column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 12:14:47.600905 6809 services_controller.go:434] Service openshift-kube-scheduler/scheduler retrieved from lister for 
network=default: \\\\u0026Service{ObjectMeta:{scheduler openshift-kube-scheduler 66d01bf5-1923-42ac-8d2b-24819bd09205 4792 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-scheduler] map[operator.openshift.io/spec-hash:f185087b7610499b49263c17685abe7f251a50c890808284a072687bf6d73275 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10259 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{scheduler: true,},ClusterIP:10.217.4.169,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.169],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPoli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},
{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.906190 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.919860 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.937231 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:35Z\\\",\\\"message\\\":\\\"2025-12-04T12:13:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0\\\\n2025-12-04T12:13:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0 to /host/opt/cni/bin/\\\\n2025-12-04T12:13:50Z [verbose] multus-daemon started\\\\n2025-12-04T12:13:50Z [verbose] 
Readiness Indicator file check\\\\n2025-12-04T12:14:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.938590 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.938699 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.938715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.939022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.939046 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:57Z","lastTransitionTime":"2025-12-04T12:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.952331 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.970170 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 
12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:57 crc kubenswrapper[4760]: I1204 12:14:57.986633 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
4T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:57Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.003628 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.019204 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.034153 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.043010 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.043058 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.043070 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.043087 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.043099 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:58Z","lastTransitionTime":"2025-12-04T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.047602 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.074202 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d
15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.090661 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.114634 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.131233 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.145176 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.145229 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.145272 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.145291 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.145358 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:58Z","lastTransitionTime":"2025-12-04T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.147914 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.165278 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a67
7577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.179618 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc 
kubenswrapper[4760]: I1204 12:14:58.191218 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"111765a1-1fb7-4181-9d4a-78de25d8922e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c65534c15fddf0804bfa1bd6742373c5f11b4c947f135d29b84eda8e2ff7ace1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:14:58Z is after 2025-08-24T17:21:41Z" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.247854 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.247907 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.247919 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 
12:14:58.247937 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.247948 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:58Z","lastTransitionTime":"2025-12-04T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.351034 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.351073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.351082 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.351097 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.351107 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:58Z","lastTransitionTime":"2025-12-04T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.453323 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.453377 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.453390 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.453408 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.453421 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:58Z","lastTransitionTime":"2025-12-04T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.556958 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.557014 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.557030 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.557050 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.557064 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:58Z","lastTransitionTime":"2025-12-04T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.660095 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.660150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.660166 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.660187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.660203 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:58Z","lastTransitionTime":"2025-12-04T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.764450 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.764540 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.764567 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.764602 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.764630 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:58Z","lastTransitionTime":"2025-12-04T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.863435 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.863470 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.863440 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:14:58 crc kubenswrapper[4760]: E1204 12:14:58.863581 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:14:58 crc kubenswrapper[4760]: E1204 12:14:58.863628 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:14:58 crc kubenswrapper[4760]: E1204 12:14:58.863688 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.867043 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.867078 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.867091 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.867107 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.867118 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:58Z","lastTransitionTime":"2025-12-04T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.970178 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.970245 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.970257 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.970274 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:58 crc kubenswrapper[4760]: I1204 12:14:58.970285 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:58Z","lastTransitionTime":"2025-12-04T12:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.073441 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.073493 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.073530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.073548 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.073562 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:59Z","lastTransitionTime":"2025-12-04T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.176727 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.176808 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.176821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.176838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.176850 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:59Z","lastTransitionTime":"2025-12-04T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.279010 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.279299 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.279372 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.279510 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.279591 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:59Z","lastTransitionTime":"2025-12-04T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.382079 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.382121 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.382129 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.382144 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.382155 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:59Z","lastTransitionTime":"2025-12-04T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.484557 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.484809 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.484961 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.485063 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.485126 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:59Z","lastTransitionTime":"2025-12-04T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.587930 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.587972 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.587981 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.587997 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.588008 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:59Z","lastTransitionTime":"2025-12-04T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.690755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.690820 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.690833 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.690854 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.690866 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:59Z","lastTransitionTime":"2025-12-04T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.794175 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.794267 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.794278 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.794296 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.794316 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:59Z","lastTransitionTime":"2025-12-04T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.864025 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:14:59 crc kubenswrapper[4760]: E1204 12:14:59.864266 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.896839 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.896912 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.896924 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.896949 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.896967 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:59Z","lastTransitionTime":"2025-12-04T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.999662 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.999708 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:14:59 crc kubenswrapper[4760]: I1204 12:14:59.999717 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:14:59.999733 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:14:59.999743 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:14:59Z","lastTransitionTime":"2025-12-04T12:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.103645 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.103696 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.103709 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.103731 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.103746 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:00Z","lastTransitionTime":"2025-12-04T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.206573 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.206651 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.206663 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.206683 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.206694 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:00Z","lastTransitionTime":"2025-12-04T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.309003 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.309044 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.309056 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.309073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.309084 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:00Z","lastTransitionTime":"2025-12-04T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.412256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.412318 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.412329 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.412378 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.412392 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:00Z","lastTransitionTime":"2025-12-04T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.515155 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.515201 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.515233 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.515250 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.515261 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:00Z","lastTransitionTime":"2025-12-04T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.618142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.618194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.618220 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.618238 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.618250 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:00Z","lastTransitionTime":"2025-12-04T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.721475 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.721530 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.721544 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.721562 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.721577 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:00Z","lastTransitionTime":"2025-12-04T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.824044 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.824108 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.824123 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.824142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.824155 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:00Z","lastTransitionTime":"2025-12-04T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.863692 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:00 crc kubenswrapper[4760]: E1204 12:15:00.864408 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.863938 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:00 crc kubenswrapper[4760]: E1204 12:15:00.864516 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.863715 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:00 crc kubenswrapper[4760]: E1204 12:15:00.864582 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.926580 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.926627 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.926646 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.926713 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:00 crc kubenswrapper[4760]: I1204 12:15:00.926730 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:00Z","lastTransitionTime":"2025-12-04T12:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.029083 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.029161 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.029172 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.029189 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.029200 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:01Z","lastTransitionTime":"2025-12-04T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.131933 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.131987 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.132000 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.132018 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.132039 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:01Z","lastTransitionTime":"2025-12-04T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.235301 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.235977 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.236024 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.236056 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.236077 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:01Z","lastTransitionTime":"2025-12-04T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.340297 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.340373 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.340389 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.340410 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.340423 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:01Z","lastTransitionTime":"2025-12-04T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.444150 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.444254 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.444267 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.444284 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.444296 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:01Z","lastTransitionTime":"2025-12-04T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.552674 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.552787 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.552803 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.552838 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.552857 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:01Z","lastTransitionTime":"2025-12-04T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.655629 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.655680 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.655691 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.655705 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.655716 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:01Z","lastTransitionTime":"2025-12-04T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.758705 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.758848 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.758865 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.758882 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.758893 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:01Z","lastTransitionTime":"2025-12-04T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.861450 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.861500 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.861510 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.861524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.861533 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:01Z","lastTransitionTime":"2025-12-04T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.863918 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:01 crc kubenswrapper[4760]: E1204 12:15:01.864023 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.864704 4760 scope.go:117] "RemoveContainer" containerID="6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0" Dec 04 12:15:01 crc kubenswrapper[4760]: E1204 12:15:01.864843 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.888454 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69907424-ac0b-4430-b508-af165754104f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:48Z\\\",\\\"message\\\":\\\"e column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1204 12:14:47.600905 6809 services_controller.go:434] Service openshift-kube-scheduler/scheduler retrieved from lister for network=default: 
\\\\u0026Service{ObjectMeta:{scheduler openshift-kube-scheduler 66d01bf5-1923-42ac-8d2b-24819bd09205 4792 0 2025-02-23 05:22:52 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-scheduler] map[operator.openshift.io/spec-hash:f185087b7610499b49263c17685abe7f251a50c890808284a072687bf6d73275 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10259 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{scheduler: true,},ClusterIP:10.217.4.169,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.169],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPoli\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:14:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13c9fc247f2dd7cd5d
19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hvpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-q8b49\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.906032 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499d924-81bc-4bd7-8148-1fd816851d20\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T12:13:43Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1204 12:13:40.938908 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 12:13:40.939174 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 12:13:40.945617 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3248821087/tls.crt::/tmp/serving-cert-3248821087/tls.key\\\\\\\"\\\\nI1204 12:13:41.315898 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 12:13:41.919183 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 12:13:41.919243 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 12:13:41.919270 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 12:13:41.919275 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 12:13:41.930062 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 12:13:41.930104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 12:13:41.930112 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1204 12:13:41.930111 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1204 12:13:41.930119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 12:13:41.930169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 12:13:41.930174 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 12:13:41.930180 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1204 12:13:41.934797 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf45535911210b45f6ec8062d5d541ac0c5
46b27944f22c7ae4204570afe5da5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.923918 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.937727 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65f76314-9511-40ed-9ad6-2220378e7e97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4018697946d3562a04e4d3d13f6f48146512bf801c2b08a1b69e3c2f49ae7028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81665eb77c37ed6f538339d1eaeca2608238099
d8ff2f1cc5ccbde51de16683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf8ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnrr9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.955502 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-dg5hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"017b9fc1-6db4-4786-81f1-6cb9b09c90a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T12:14:35Z\\\",\\\"message\\\":\\\"2025-12-04T12:13:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0\\\\n2025-12-04T12:13:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_854218a8-169e-4204-a368-fbf1f8cd82e0 to /host/opt/cni/bin/\\\\n2025-12-04T12:13:50Z [verbose] multus-daemon started\\\\n2025-12-04T12:13:50Z [verbose] 
Readiness Indicator file check\\\\n2025-12-04T12:14:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:14:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cgd2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-dg5hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.963991 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.964071 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.964085 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.964124 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.964138 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:01Z","lastTransitionTime":"2025-12-04T12:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.970329 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2cfhd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e89efa-dd42-4b9a-9884-1a870b916762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e547074956851967e6f2c18add9f32b600f5212b1c04aa4a3b78b9dfdb08ba7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgrcc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2cfhd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:01 crc kubenswrapper[4760]: I1204 12:15:01.986159 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e76a717e-0d35-4731-89fe-2ce076575e96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://387322cb942df5fb304a1254b17999d840b446193a802a499c75c3ebe0116219\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d1d96522386dc07c992c4aa5bef79a406c087dbd88e1cf01a7c980bb6d3132\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40919bdb7a7a913f4143685de09bfb91c88dc8040e8cdc240f6238919081b047\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:01Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.007750 4760 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"231aa51f-8636-462f-8847-b935ebbd7265\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e2eb0ea16ce707e28e35ce72cc33b082f7d2bbfd25baa83d7ac4e2d033eef25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e86984bfd2366bb7c871a6b5a97eedc60222ab63cf187a67125ac78b25d65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a0461d7524a0c3e92e1e201228088643b8d1d4a0367072d9427444d8695bbf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a80bf24ac5d1d49309bbea4bb7c7b90b62294f832682969b0c7b3b94e372dd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.023150 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f91d09c195d385055da35f0c99839722ff0a2e68f91585dc4f592d0e95195ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.038834 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.055851 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170a9d0e9f45f70db235096de452658cd186e8b1e31cf95f26b5164d28eb057e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2be5c75c73dd9409b455aae7458ef19a57f7ff46475480cf24bc089348513617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.067514 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.067564 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.067577 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.067597 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.067611 4760 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.069543 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4br74" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f60604a-f694-4df9-bb00-117eb8e9f325\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d628aeb7f3c44e46ae10ae41bf8bb286f4bd72b1c618049d9b03c7e2ea2b516a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9sj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4br74\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.084204 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"345f593d-ac28-4bf4-aed0-adbad7c3a90e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f860003aee9708aa28acebe33ab063ca2b5c066515f194080d31e4e3fddaf7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc4d142d09f2784a4c2843442f71e2988f8d
15fb8f87bc87ca8c1a759c57ab1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb88n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:58Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4jxpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.097677 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"111765a1-1fb7-4181-9d4a-78de25d8922e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c65534c15fddf0804bfa1bd6742373c5f11b4c947f135d29b84eda8e2ff7ace1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://633a7ab5f84b6d483f324c2d268fc6602ef9c90ae2c24efe4057022ec95b6cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.121009 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5b75362-3896-4d4a-943d-78bf67722f2f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c594f0fcf7ceddd7c603ebd1dbd0076fda6620c9656eeb8f460ba5e20ba3281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36519c7d20f1484ceb732e4e0f175648d7aeb8764d105404bb9329f0a5f5cd7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://747753739e25a85ae82638e6a918bcf713dd1ab06b34c1a868243111243ee609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d08cba82e4325d77b01dcb9248c1b921f0fd8e026f71925d16126a672b525b68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb943a4ddea322fc6533861b073514ec5882bd239396137784e63be458c2a47b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0921a91eb935678f0535b16e4cf5a3b77ebd709f1b06d5ce227be54cdd2c320b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-04T12:13:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0f45811b38fcb9f2bb5e2632d6d7d79a8cfdf299ab5391a05be66d82171d924\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16efc3621cdf433afa07c1e67908f7032444011cca9b9d419bced35da55c0d5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.136966 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.152772 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67453dc080dbffda61e0b4d54411e33d739e2c59278b4b9ed63ced72ecd787d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.170124 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc874ce0-7f43-4ba9-921a-dd8141d738a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:13:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7a45fa4fa0b2e11ffc22b256a51c20fb5129ef6947c25f22ec5c981d02b4704\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T12:13:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db8985a81bed9848cf4d83c47c174213584f8d736d784340b77817f19a6429b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1a63dd58720aa0e401bdb3f5a51f7e33c80952b99905c31d4a64e58548b24a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b109976bbf1d917234954e60f088001329dbda489d52fa3a4c82358c6a579d13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6a677577f1699992d9661060b282cd36fd150696186e97dc6569501e1365d47\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a22987acd53d8e170fe710d296257a53820eefc29a414723add37df8e15af43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ccc825ecaf74e5aa1a5a6e9b5667a2d30050ce470dbd6a481feddda587ea0876\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T12:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T12:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk8d2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:13:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvk2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.170516 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.170540 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.170549 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.170566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.170579 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.184918 4760 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xpngr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4fd6a47-556a-4236-9f60-0e7996e4608a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T12:14:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6dlzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T12:14:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xpngr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc 
kubenswrapper[4760]: I1204 12:15:02.273098 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.273149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.273159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.273174 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.273184 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.376492 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.376538 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.376553 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.376570 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.376580 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.479072 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.479114 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.479125 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.479141 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.479152 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.582071 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.582140 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.582152 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.582171 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.582181 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.684809 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.684867 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.684876 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.684893 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.684903 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.719839 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.719893 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.719910 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.719930 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.719943 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: E1204 12:15:02.733598 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.738124 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.738162 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.738171 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.738185 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.738194 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: E1204 12:15:02.752923 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.757170 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.757225 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.757235 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.757254 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.757264 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: E1204 12:15:02.816001 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T12:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f9c326ab-1318-43cc-ac8e-7cfd64c1e669\\\",\\\"systemUUID\\\":\\\"c3d842b6-e196-432f-8258-ff304cd02e6f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T12:15:02Z is after 2025-08-24T17:21:41Z" Dec 04 12:15:02 crc kubenswrapper[4760]: E1204 12:15:02.816124 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.817718 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.817778 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.817791 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.817809 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.817820 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.863500 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:02 crc kubenswrapper[4760]: E1204 12:15:02.863636 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.863500 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.863502 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:02 crc kubenswrapper[4760]: E1204 12:15:02.863855 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:02 crc kubenswrapper[4760]: E1204 12:15:02.863985 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.920598 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.920656 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.920675 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.920696 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:02 crc kubenswrapper[4760]: I1204 12:15:02.920709 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:02Z","lastTransitionTime":"2025-12-04T12:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.023498 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.023560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.023571 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.023590 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.023602 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:03Z","lastTransitionTime":"2025-12-04T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.127417 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.127488 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.127502 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.127524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.127539 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:03Z","lastTransitionTime":"2025-12-04T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.230068 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.230111 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.230120 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.230134 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.230143 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:03Z","lastTransitionTime":"2025-12-04T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.333580 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.333644 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.333657 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.333676 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.333689 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:03Z","lastTransitionTime":"2025-12-04T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.436268 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.436342 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.436355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.436379 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.436397 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:03Z","lastTransitionTime":"2025-12-04T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.539559 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.539603 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.539642 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.539660 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.539672 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:03Z","lastTransitionTime":"2025-12-04T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.641973 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.642007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.642024 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.642043 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.642054 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:03Z","lastTransitionTime":"2025-12-04T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.745505 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.745581 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.745611 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.745646 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.745672 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:03Z","lastTransitionTime":"2025-12-04T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.848507 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.848556 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.848566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.848586 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.848599 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:03Z","lastTransitionTime":"2025-12-04T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.864086 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:03 crc kubenswrapper[4760]: E1204 12:15:03.864337 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.946255 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:03 crc kubenswrapper[4760]: E1204 12:15:03.946501 4760 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:15:03 crc kubenswrapper[4760]: E1204 12:15:03.946617 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs podName:b4fd6a47-556a-4236-9f60-0e7996e4608a nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.946588861 +0000 UTC m=+170.988035498 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs") pod "network-metrics-daemon-xpngr" (UID: "b4fd6a47-556a-4236-9f60-0e7996e4608a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.951749 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.951783 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.951793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.951817 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:03 crc kubenswrapper[4760]: I1204 12:15:03.951827 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:03Z","lastTransitionTime":"2025-12-04T12:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.054504 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.054598 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.054625 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.054657 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.054681 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:04Z","lastTransitionTime":"2025-12-04T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.157453 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.157496 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.157506 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.157519 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.157528 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:04Z","lastTransitionTime":"2025-12-04T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.259787 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.259829 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.259839 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.259854 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.259864 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:04Z","lastTransitionTime":"2025-12-04T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.362409 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.362466 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.362479 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.362498 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.362511 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:04Z","lastTransitionTime":"2025-12-04T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.464967 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.465748 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.465789 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.465849 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.465876 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:04Z","lastTransitionTime":"2025-12-04T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.568688 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.568741 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.568757 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.568776 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.568788 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:04Z","lastTransitionTime":"2025-12-04T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.671878 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.671927 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.671942 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.671959 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.671971 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:04Z","lastTransitionTime":"2025-12-04T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.774574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.774638 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.774652 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.774670 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.774683 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:04Z","lastTransitionTime":"2025-12-04T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.863528 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.863612 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:04 crc kubenswrapper[4760]: E1204 12:15:04.863705 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.863770 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:04 crc kubenswrapper[4760]: E1204 12:15:04.863912 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:04 crc kubenswrapper[4760]: E1204 12:15:04.864038 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.878029 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.878099 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.878122 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.878142 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.878156 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:04Z","lastTransitionTime":"2025-12-04T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.981169 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.981261 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.981275 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.981333 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:04 crc kubenswrapper[4760]: I1204 12:15:04.981353 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:04Z","lastTransitionTime":"2025-12-04T12:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.084191 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.084273 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.084289 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.084306 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.084315 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:05Z","lastTransitionTime":"2025-12-04T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.186980 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.187020 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.187030 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.187047 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.187057 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:05Z","lastTransitionTime":"2025-12-04T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.289679 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.289731 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.289744 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.289761 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.289775 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:05Z","lastTransitionTime":"2025-12-04T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.393242 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.393294 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.393303 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.393317 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.393327 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:05Z","lastTransitionTime":"2025-12-04T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.496321 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.496377 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.496389 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.496406 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.496416 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:05Z","lastTransitionTime":"2025-12-04T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.599104 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.599156 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.599164 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.599178 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.599187 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:05Z","lastTransitionTime":"2025-12-04T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.701240 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.701299 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.701315 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.701340 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.701354 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:05Z","lastTransitionTime":"2025-12-04T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.804139 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.804182 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.804194 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.804226 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.804239 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:05Z","lastTransitionTime":"2025-12-04T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.863605 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:05 crc kubenswrapper[4760]: E1204 12:15:05.863954 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.906794 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.906833 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.906842 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.906855 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:05 crc kubenswrapper[4760]: I1204 12:15:05.906863 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:05Z","lastTransitionTime":"2025-12-04T12:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.010108 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.010151 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.010159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.010175 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.010185 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:06Z","lastTransitionTime":"2025-12-04T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.114088 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.114140 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.114152 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.114169 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.114183 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:06Z","lastTransitionTime":"2025-12-04T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.219302 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.219352 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.219362 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.219380 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.219396 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:06Z","lastTransitionTime":"2025-12-04T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.323300 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.323365 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.323375 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.323398 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.323412 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:06Z","lastTransitionTime":"2025-12-04T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.425629 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.425727 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.425736 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.425752 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.425762 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:06Z","lastTransitionTime":"2025-12-04T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.527998 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.528064 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.528073 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.528088 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.528099 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:06Z","lastTransitionTime":"2025-12-04T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.630873 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.630950 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.630960 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.631008 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.631034 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:06Z","lastTransitionTime":"2025-12-04T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.733823 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.733867 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.733878 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.733892 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.733903 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:06Z","lastTransitionTime":"2025-12-04T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.836114 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.836157 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.836169 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.836186 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.836200 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:06Z","lastTransitionTime":"2025-12-04T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.863920 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.863981 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:06 crc kubenswrapper[4760]: E1204 12:15:06.864048 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:06 crc kubenswrapper[4760]: E1204 12:15:06.864179 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.864370 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:06 crc kubenswrapper[4760]: E1204 12:15:06.864641 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.940779 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.940887 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.941256 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.941295 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:06 crc kubenswrapper[4760]: I1204 12:15:06.941309 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:06Z","lastTransitionTime":"2025-12-04T12:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.044908 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.044983 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.045001 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.045025 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.045037 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:07Z","lastTransitionTime":"2025-12-04T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.149033 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.149070 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.149082 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.149098 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.149111 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:07Z","lastTransitionTime":"2025-12-04T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.252798 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.252842 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.252861 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.252880 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.252891 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:07Z","lastTransitionTime":"2025-12-04T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.355715 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.355744 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.355755 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.355771 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.355783 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:07Z","lastTransitionTime":"2025-12-04T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.458312 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.458366 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.458385 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.458406 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.458419 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:07Z","lastTransitionTime":"2025-12-04T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.562113 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.562159 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.562168 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.562186 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.562202 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:07Z","lastTransitionTime":"2025-12-04T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.664472 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.664529 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.664544 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.664563 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.664571 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:07Z","lastTransitionTime":"2025-12-04T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.767944 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.767993 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.768004 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.768020 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.768031 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:07Z","lastTransitionTime":"2025-12-04T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.863706 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:07 crc kubenswrapper[4760]: E1204 12:15:07.864150 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.872730 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.872775 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.872784 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.872801 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.872815 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:07Z","lastTransitionTime":"2025-12-04T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.923959 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=83.923937594 podStartE2EDuration="1m23.923937594s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:07.904163876 +0000 UTC m=+110.945610443" watchObservedRunningTime="2025-12-04 12:15:07.923937594 +0000 UTC m=+110.965384161" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.948041 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=62.948022862 podStartE2EDuration="1m2.948022862s" podCreationTimestamp="2025-12-04 12:14:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:07.923850021 +0000 UTC m=+110.965296598" watchObservedRunningTime="2025-12-04 12:15:07.948022862 +0000 UTC m=+110.989469429" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.976163 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.976200 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.976226 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.976244 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:07 crc kubenswrapper[4760]: I1204 12:15:07.976254 4760 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:07Z","lastTransitionTime":"2025-12-04T12:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.024403 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4br74" podStartSLOduration=85.024382101 podStartE2EDuration="1m25.024382101s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:08.005012034 +0000 UTC m=+111.046458611" watchObservedRunningTime="2025-12-04 12:15:08.024382101 +0000 UTC m=+111.065828668" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.024813 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4jxpg" podStartSLOduration=84.024807242 podStartE2EDuration="1m24.024807242s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:08.023977401 +0000 UTC m=+111.065423968" watchObservedRunningTime="2025-12-04 12:15:08.024807242 +0000 UTC m=+111.066253809" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.079291 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.079355 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.079372 4760 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.079392 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.079404 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:08Z","lastTransitionTime":"2025-12-04T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.085189 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=39.085170281 podStartE2EDuration="39.085170281s" podCreationTimestamp="2025-12-04 12:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:08.040974127 +0000 UTC m=+111.082420704" watchObservedRunningTime="2025-12-04 12:15:08.085170281 +0000 UTC m=+111.126616848" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.085332 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=85.085328285 podStartE2EDuration="1m25.085328285s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:08.084609846 +0000 UTC m=+111.126056413" watchObservedRunningTime="2025-12-04 12:15:08.085328285 +0000 UTC m=+111.126774852" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.149392 4760 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bvk2c" podStartSLOduration=84.149358248 podStartE2EDuration="1m24.149358248s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:08.148548406 +0000 UTC m=+111.189994993" watchObservedRunningTime="2025-12-04 12:15:08.149358248 +0000 UTC m=+111.190804815" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.182375 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.182403 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.182412 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.182425 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.182436 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:08Z","lastTransitionTime":"2025-12-04T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.236763 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=84.2367421 podStartE2EDuration="1m24.2367421s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:08.217753722 +0000 UTC m=+111.259200289" watchObservedRunningTime="2025-12-04 12:15:08.2367421 +0000 UTC m=+111.278188667" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.277402 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podStartSLOduration=85.277378022 podStartE2EDuration="1m25.277378022s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:08.254103544 +0000 UTC m=+111.295550121" watchObservedRunningTime="2025-12-04 12:15:08.277378022 +0000 UTC m=+111.318824589" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.278466 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dg5hd" podStartSLOduration=84.27845835 podStartE2EDuration="1m24.27845835s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:08.277849094 +0000 UTC m=+111.319295671" watchObservedRunningTime="2025-12-04 12:15:08.27845835 +0000 UTC m=+111.319904917" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.284468 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.284529 
4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.284546 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.284566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.284598 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:08Z","lastTransitionTime":"2025-12-04T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.293390 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2cfhd" podStartSLOduration=85.293369582 podStartE2EDuration="1m25.293369582s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:08.292464369 +0000 UTC m=+111.333910956" watchObservedRunningTime="2025-12-04 12:15:08.293369582 +0000 UTC m=+111.334816149" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.388740 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.388820 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.388832 4760 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.388852 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.388866 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:08Z","lastTransitionTime":"2025-12-04T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.493075 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.493167 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.493185 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.493234 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.493256 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:08Z","lastTransitionTime":"2025-12-04T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.596878 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.596929 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.596941 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.596964 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.596983 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:08Z","lastTransitionTime":"2025-12-04T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.699620 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.700169 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.700187 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.700224 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.700237 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:08Z","lastTransitionTime":"2025-12-04T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.803236 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.803433 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.803450 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.803474 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.803488 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:08Z","lastTransitionTime":"2025-12-04T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.864007 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.864071 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.864027 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:08 crc kubenswrapper[4760]: E1204 12:15:08.864262 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:08 crc kubenswrapper[4760]: E1204 12:15:08.864446 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:08 crc kubenswrapper[4760]: E1204 12:15:08.864554 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.906960 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.907119 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.907527 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.907566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:08 crc kubenswrapper[4760]: I1204 12:15:08.907583 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:08Z","lastTransitionTime":"2025-12-04T12:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.012300 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.012370 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.012389 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.012415 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.012430 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:09Z","lastTransitionTime":"2025-12-04T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.115810 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.115869 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.115886 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.115904 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.115916 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:09Z","lastTransitionTime":"2025-12-04T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.219930 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.219973 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.220007 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.220027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.220042 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:09Z","lastTransitionTime":"2025-12-04T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.324910 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.324994 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.325005 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.325022 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.325052 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:09Z","lastTransitionTime":"2025-12-04T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.430135 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.430299 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.430315 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.430347 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.430362 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:09Z","lastTransitionTime":"2025-12-04T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.534046 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.534113 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.534126 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.534149 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.534167 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:09Z","lastTransitionTime":"2025-12-04T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.637719 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.637829 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.637845 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.637871 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.637886 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:09Z","lastTransitionTime":"2025-12-04T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.741464 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.741528 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.741541 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.741557 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.741566 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:09Z","lastTransitionTime":"2025-12-04T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.845600 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.845647 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.845657 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.845676 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.845686 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:09Z","lastTransitionTime":"2025-12-04T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.863400 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:09 crc kubenswrapper[4760]: E1204 12:15:09.863618 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.950485 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.950553 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.950565 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.950583 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:09 crc kubenswrapper[4760]: I1204 12:15:09.950599 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:09Z","lastTransitionTime":"2025-12-04T12:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.053389 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.053486 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.053499 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.053518 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.053530 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:10Z","lastTransitionTime":"2025-12-04T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.156061 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.156124 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.156141 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.156161 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.156173 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:10Z","lastTransitionTime":"2025-12-04T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.259267 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.259310 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.259318 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.259335 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.259346 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:10Z","lastTransitionTime":"2025-12-04T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.362504 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.362550 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.362559 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.362574 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.362586 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:10Z","lastTransitionTime":"2025-12-04T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.465548 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.465597 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.465615 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.465637 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.465650 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:10Z","lastTransitionTime":"2025-12-04T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.569190 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.569281 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.569296 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.569322 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.569334 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:10Z","lastTransitionTime":"2025-12-04T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.672821 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.672934 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.672962 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.672987 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.673005 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:10Z","lastTransitionTime":"2025-12-04T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.776766 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.776827 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.776839 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.776854 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.776864 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:10Z","lastTransitionTime":"2025-12-04T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.863302 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.863404 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.863358 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:10 crc kubenswrapper[4760]: E1204 12:15:10.863547 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:10 crc kubenswrapper[4760]: E1204 12:15:10.863693 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:10 crc kubenswrapper[4760]: E1204 12:15:10.863798 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.879698 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.879748 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.879758 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.879774 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.879783 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:10Z","lastTransitionTime":"2025-12-04T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.983835 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.984562 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.984587 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.984618 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:10 crc kubenswrapper[4760]: I1204 12:15:10.984636 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:10Z","lastTransitionTime":"2025-12-04T12:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.088542 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.088611 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.088625 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.088646 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.088659 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:11Z","lastTransitionTime":"2025-12-04T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.191566 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.191623 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.191639 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.191662 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.191676 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:11Z","lastTransitionTime":"2025-12-04T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.294543 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.294580 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.294591 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.294611 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.294630 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:11Z","lastTransitionTime":"2025-12-04T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.397730 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.397781 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.397793 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.397809 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.397818 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:11Z","lastTransitionTime":"2025-12-04T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.500338 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.500387 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.500397 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.500412 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.500460 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:11Z","lastTransitionTime":"2025-12-04T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.603402 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.603455 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.603467 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.603484 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.603496 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:11Z","lastTransitionTime":"2025-12-04T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.705639 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.705694 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.705707 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.705728 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.705742 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:11Z","lastTransitionTime":"2025-12-04T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.808560 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.808621 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.808632 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.808654 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.808666 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:11Z","lastTransitionTime":"2025-12-04T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.863727 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:11 crc kubenswrapper[4760]: E1204 12:15:11.863893 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.911834 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.911886 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.911895 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.911911 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:11 crc kubenswrapper[4760]: I1204 12:15:11.911921 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:11Z","lastTransitionTime":"2025-12-04T12:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.014454 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.014492 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.014503 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.014519 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.014531 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:12Z","lastTransitionTime":"2025-12-04T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.117722 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.117776 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.117788 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.117809 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.117822 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:12Z","lastTransitionTime":"2025-12-04T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.220374 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.220469 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.220480 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.220497 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.220510 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:12Z","lastTransitionTime":"2025-12-04T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.324017 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.324093 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.324105 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.324127 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.324141 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:12Z","lastTransitionTime":"2025-12-04T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.426395 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.426452 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.426471 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.426489 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.426500 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:12Z","lastTransitionTime":"2025-12-04T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.529589 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.529638 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.529647 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.529663 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.529675 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:12Z","lastTransitionTime":"2025-12-04T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.632323 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.632396 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.632408 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.632428 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.632441 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:12Z","lastTransitionTime":"2025-12-04T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.735420 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.735487 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.735502 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.735524 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.735537 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:12Z","lastTransitionTime":"2025-12-04T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.837894 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.837954 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.837966 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.837981 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.837990 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:12Z","lastTransitionTime":"2025-12-04T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.863549 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.863608 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.863633 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:12 crc kubenswrapper[4760]: E1204 12:15:12.863686 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:12 crc kubenswrapper[4760]: E1204 12:15:12.863779 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:12 crc kubenswrapper[4760]: E1204 12:15:12.863820 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.864453 4760 scope.go:117] "RemoveContainer" containerID="6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0" Dec 04 12:15:12 crc kubenswrapper[4760]: E1204 12:15:12.864607 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.882251 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.882315 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.882326 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.882345 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.882362 4760 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T12:15:12Z","lastTransitionTime":"2025-12-04T12:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.934969 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw"] Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.935344 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.937950 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.938013 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.938313 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 12:15:12 crc kubenswrapper[4760]: I1204 12:15:12.940247 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.059429 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.059487 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: 
\"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.059557 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.059593 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.059624 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.160600 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.160670 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.160756 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.160791 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.160809 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.160891 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.160829 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.161615 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.168225 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.179441 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48ddd1d1-15a4-44dc-be2c-ec764feb9ae1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dlfsw\" (UID: \"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.249661 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" Dec 04 12:15:13 crc kubenswrapper[4760]: W1204 12:15:13.265853 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48ddd1d1_15a4_44dc_be2c_ec764feb9ae1.slice/crio-2f7b28973084397c19c057faa3a96de1578df33376ee7739be0022874a208ee2 WatchSource:0}: Error finding container 2f7b28973084397c19c057faa3a96de1578df33376ee7739be0022874a208ee2: Status 404 returned error can't find the container with id 2f7b28973084397c19c057faa3a96de1578df33376ee7739be0022874a208ee2 Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.631682 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" event={"ID":"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1","Type":"ContainerStarted","Data":"b8e24b07ddd9cf99c3437d3af91068233a4f14795b2773c0388c0d547723c11d"} Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.631743 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" event={"ID":"48ddd1d1-15a4-44dc-be2c-ec764feb9ae1","Type":"ContainerStarted","Data":"2f7b28973084397c19c057faa3a96de1578df33376ee7739be0022874a208ee2"} Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.646784 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dlfsw" podStartSLOduration=90.646763293 podStartE2EDuration="1m30.646763293s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:13.646611208 +0000 UTC m=+116.688057775" watchObservedRunningTime="2025-12-04 12:15:13.646763293 +0000 UTC m=+116.688209870" Dec 04 12:15:13 crc kubenswrapper[4760]: I1204 12:15:13.864057 4760 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:13 crc kubenswrapper[4760]: E1204 12:15:13.864238 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:14 crc kubenswrapper[4760]: I1204 12:15:14.863639 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:14 crc kubenswrapper[4760]: I1204 12:15:14.863702 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:14 crc kubenswrapper[4760]: I1204 12:15:14.863747 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:14 crc kubenswrapper[4760]: E1204 12:15:14.863771 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:14 crc kubenswrapper[4760]: E1204 12:15:14.863829 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:14 crc kubenswrapper[4760]: E1204 12:15:14.863900 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:15 crc kubenswrapper[4760]: I1204 12:15:15.864076 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:15 crc kubenswrapper[4760]: E1204 12:15:15.864284 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:16 crc kubenswrapper[4760]: I1204 12:15:16.863540 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:16 crc kubenswrapper[4760]: I1204 12:15:16.863641 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:16 crc kubenswrapper[4760]: I1204 12:15:16.863692 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:16 crc kubenswrapper[4760]: E1204 12:15:16.863796 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:16 crc kubenswrapper[4760]: E1204 12:15:16.863882 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:16 crc kubenswrapper[4760]: E1204 12:15:16.864001 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:17 crc kubenswrapper[4760]: I1204 12:15:17.863243 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:17 crc kubenswrapper[4760]: E1204 12:15:17.864111 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:17 crc kubenswrapper[4760]: E1204 12:15:17.885699 4760 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 04 12:15:18 crc kubenswrapper[4760]: E1204 12:15:18.112039 4760 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 12:15:18 crc kubenswrapper[4760]: I1204 12:15:18.863763 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:18 crc kubenswrapper[4760]: I1204 12:15:18.863879 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:18 crc kubenswrapper[4760]: E1204 12:15:18.863912 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:18 crc kubenswrapper[4760]: I1204 12:15:18.863792 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:18 crc kubenswrapper[4760]: E1204 12:15:18.864080 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:18 crc kubenswrapper[4760]: E1204 12:15:18.864177 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:19 crc kubenswrapper[4760]: I1204 12:15:19.863554 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:19 crc kubenswrapper[4760]: E1204 12:15:19.863704 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:20 crc kubenswrapper[4760]: I1204 12:15:20.864182 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:20 crc kubenswrapper[4760]: I1204 12:15:20.864263 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:20 crc kubenswrapper[4760]: E1204 12:15:20.864703 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:20 crc kubenswrapper[4760]: E1204 12:15:20.864845 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:20 crc kubenswrapper[4760]: I1204 12:15:20.864270 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:20 crc kubenswrapper[4760]: E1204 12:15:20.864955 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:21 crc kubenswrapper[4760]: I1204 12:15:21.863368 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:21 crc kubenswrapper[4760]: E1204 12:15:21.863540 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:22 crc kubenswrapper[4760]: I1204 12:15:22.663898 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dg5hd_017b9fc1-6db4-4786-81f1-6cb9b09c90a3/kube-multus/1.log" Dec 04 12:15:22 crc kubenswrapper[4760]: I1204 12:15:22.664609 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dg5hd_017b9fc1-6db4-4786-81f1-6cb9b09c90a3/kube-multus/0.log" Dec 04 12:15:22 crc kubenswrapper[4760]: I1204 12:15:22.664694 4760 generic.go:334] "Generic (PLEG): container finished" podID="017b9fc1-6db4-4786-81f1-6cb9b09c90a3" containerID="d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887" exitCode=1 Dec 04 12:15:22 crc kubenswrapper[4760]: I1204 12:15:22.664757 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dg5hd" event={"ID":"017b9fc1-6db4-4786-81f1-6cb9b09c90a3","Type":"ContainerDied","Data":"d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887"} Dec 04 12:15:22 crc kubenswrapper[4760]: I1204 12:15:22.664815 4760 scope.go:117] "RemoveContainer" containerID="1c27079c1c72797970de51229932ba48f8cd5cb21ce5cd43709619473a839249" Dec 04 12:15:22 crc kubenswrapper[4760]: I1204 12:15:22.665294 4760 scope.go:117] "RemoveContainer" containerID="d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887" Dec 04 12:15:22 crc kubenswrapper[4760]: E1204 12:15:22.665481 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-dg5hd_openshift-multus(017b9fc1-6db4-4786-81f1-6cb9b09c90a3)\"" pod="openshift-multus/multus-dg5hd" podUID="017b9fc1-6db4-4786-81f1-6cb9b09c90a3" Dec 04 12:15:22 crc kubenswrapper[4760]: I1204 12:15:22.863648 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:22 crc kubenswrapper[4760]: E1204 12:15:22.863845 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:22 crc kubenswrapper[4760]: I1204 12:15:22.864081 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:22 crc kubenswrapper[4760]: E1204 12:15:22.864147 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:22 crc kubenswrapper[4760]: I1204 12:15:22.864320 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:22 crc kubenswrapper[4760]: E1204 12:15:22.864401 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:23 crc kubenswrapper[4760]: E1204 12:15:23.114009 4760 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 12:15:23 crc kubenswrapper[4760]: I1204 12:15:23.669010 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dg5hd_017b9fc1-6db4-4786-81f1-6cb9b09c90a3/kube-multus/1.log" Dec 04 12:15:23 crc kubenswrapper[4760]: I1204 12:15:23.864351 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:23 crc kubenswrapper[4760]: E1204 12:15:23.864505 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:24 crc kubenswrapper[4760]: I1204 12:15:24.863429 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:24 crc kubenswrapper[4760]: I1204 12:15:24.863473 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:24 crc kubenswrapper[4760]: I1204 12:15:24.863429 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:24 crc kubenswrapper[4760]: E1204 12:15:24.863810 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:24 crc kubenswrapper[4760]: E1204 12:15:24.863905 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:24 crc kubenswrapper[4760]: E1204 12:15:24.863988 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:25 crc kubenswrapper[4760]: I1204 12:15:25.863895 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:25 crc kubenswrapper[4760]: E1204 12:15:25.864044 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:26 crc kubenswrapper[4760]: I1204 12:15:26.863839 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:26 crc kubenswrapper[4760]: I1204 12:15:26.863876 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:26 crc kubenswrapper[4760]: I1204 12:15:26.863905 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:26 crc kubenswrapper[4760]: E1204 12:15:26.864105 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:26 crc kubenswrapper[4760]: E1204 12:15:26.864190 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:26 crc kubenswrapper[4760]: E1204 12:15:26.864297 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:26 crc kubenswrapper[4760]: I1204 12:15:26.864886 4760 scope.go:117] "RemoveContainer" containerID="6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0" Dec 04 12:15:26 crc kubenswrapper[4760]: E1204 12:15:26.865020 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-q8b49_openshift-ovn-kubernetes(69907424-ac0b-4430-b508-af165754104f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" Dec 04 12:15:27 crc kubenswrapper[4760]: I1204 12:15:27.863515 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:27 crc kubenswrapper[4760]: E1204 12:15:27.864517 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:28 crc kubenswrapper[4760]: E1204 12:15:28.114561 4760 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 12:15:28 crc kubenswrapper[4760]: I1204 12:15:28.863947 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:28 crc kubenswrapper[4760]: I1204 12:15:28.863951 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:28 crc kubenswrapper[4760]: E1204 12:15:28.864118 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:28 crc kubenswrapper[4760]: E1204 12:15:28.864244 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:28 crc kubenswrapper[4760]: I1204 12:15:28.863956 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:28 crc kubenswrapper[4760]: E1204 12:15:28.864330 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:29 crc kubenswrapper[4760]: I1204 12:15:29.864089 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:29 crc kubenswrapper[4760]: E1204 12:15:29.864292 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:30 crc kubenswrapper[4760]: I1204 12:15:30.863542 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:30 crc kubenswrapper[4760]: E1204 12:15:30.863665 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:30 crc kubenswrapper[4760]: I1204 12:15:30.863564 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:30 crc kubenswrapper[4760]: E1204 12:15:30.863731 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:30 crc kubenswrapper[4760]: I1204 12:15:30.863542 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:30 crc kubenswrapper[4760]: E1204 12:15:30.863778 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:31 crc kubenswrapper[4760]: I1204 12:15:31.864070 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:31 crc kubenswrapper[4760]: E1204 12:15:31.864270 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:32 crc kubenswrapper[4760]: I1204 12:15:32.864551 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:32 crc kubenswrapper[4760]: I1204 12:15:32.864182 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:32 crc kubenswrapper[4760]: I1204 12:15:32.864557 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:32 crc kubenswrapper[4760]: E1204 12:15:32.864901 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:32 crc kubenswrapper[4760]: E1204 12:15:32.864995 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:32 crc kubenswrapper[4760]: E1204 12:15:32.865020 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:33 crc kubenswrapper[4760]: E1204 12:15:33.116150 4760 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 12:15:33 crc kubenswrapper[4760]: I1204 12:15:33.863556 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:33 crc kubenswrapper[4760]: E1204 12:15:33.864003 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:34 crc kubenswrapper[4760]: I1204 12:15:34.863266 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:34 crc kubenswrapper[4760]: I1204 12:15:34.863270 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:34 crc kubenswrapper[4760]: I1204 12:15:34.863438 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:34 crc kubenswrapper[4760]: E1204 12:15:34.863603 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:34 crc kubenswrapper[4760]: E1204 12:15:34.863711 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:34 crc kubenswrapper[4760]: E1204 12:15:34.864033 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:35 crc kubenswrapper[4760]: I1204 12:15:35.864119 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:35 crc kubenswrapper[4760]: E1204 12:15:35.864344 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:36 crc kubenswrapper[4760]: I1204 12:15:36.863675 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:36 crc kubenswrapper[4760]: I1204 12:15:36.863732 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:36 crc kubenswrapper[4760]: I1204 12:15:36.863800 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:36 crc kubenswrapper[4760]: E1204 12:15:36.863864 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:36 crc kubenswrapper[4760]: E1204 12:15:36.864057 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:36 crc kubenswrapper[4760]: I1204 12:15:36.864102 4760 scope.go:117] "RemoveContainer" containerID="d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887" Dec 04 12:15:36 crc kubenswrapper[4760]: E1204 12:15:36.864175 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:37 crc kubenswrapper[4760]: I1204 12:15:37.725632 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dg5hd_017b9fc1-6db4-4786-81f1-6cb9b09c90a3/kube-multus/1.log" Dec 04 12:15:37 crc kubenswrapper[4760]: I1204 12:15:37.725959 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dg5hd" event={"ID":"017b9fc1-6db4-4786-81f1-6cb9b09c90a3","Type":"ContainerStarted","Data":"22849b4c74cfeea314c9800b164c42a8941c66b08bb09b8eea11b3bdd74ec348"} Dec 04 12:15:37 crc kubenswrapper[4760]: I1204 12:15:37.863374 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:37 crc kubenswrapper[4760]: E1204 12:15:37.864371 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:38 crc kubenswrapper[4760]: E1204 12:15:38.116680 4760 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 12:15:38 crc kubenswrapper[4760]: I1204 12:15:38.863629 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:38 crc kubenswrapper[4760]: I1204 12:15:38.863734 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:38 crc kubenswrapper[4760]: I1204 12:15:38.863753 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:38 crc kubenswrapper[4760]: E1204 12:15:38.863872 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:38 crc kubenswrapper[4760]: E1204 12:15:38.863981 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:38 crc kubenswrapper[4760]: E1204 12:15:38.864121 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:39 crc kubenswrapper[4760]: I1204 12:15:39.865649 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:39 crc kubenswrapper[4760]: E1204 12:15:39.865777 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:40 crc kubenswrapper[4760]: I1204 12:15:40.863067 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:40 crc kubenswrapper[4760]: I1204 12:15:40.863086 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:40 crc kubenswrapper[4760]: I1204 12:15:40.863251 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:40 crc kubenswrapper[4760]: E1204 12:15:40.863184 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:40 crc kubenswrapper[4760]: E1204 12:15:40.863381 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:40 crc kubenswrapper[4760]: E1204 12:15:40.863436 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:41 crc kubenswrapper[4760]: I1204 12:15:41.863876 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:41 crc kubenswrapper[4760]: E1204 12:15:41.864037 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:41 crc kubenswrapper[4760]: I1204 12:15:41.865361 4760 scope.go:117] "RemoveContainer" containerID="6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0" Dec 04 12:15:42 crc kubenswrapper[4760]: I1204 12:15:42.713768 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xpngr"] Dec 04 12:15:42 crc kubenswrapper[4760]: I1204 12:15:42.745749 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/3.log" Dec 04 12:15:42 crc kubenswrapper[4760]: I1204 12:15:42.747857 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:42 crc kubenswrapper[4760]: E1204 12:15:42.747958 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:42 crc kubenswrapper[4760]: I1204 12:15:42.748093 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerStarted","Data":"57ad450cca5fc659a67cf074a73ec88c6bc591926b1636f59908fba3d9f25a69"} Dec 04 12:15:42 crc kubenswrapper[4760]: I1204 12:15:42.748785 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:15:42 crc kubenswrapper[4760]: I1204 12:15:42.776915 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podStartSLOduration=118.776895906 podStartE2EDuration="1m58.776895906s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:42.776118691 +0000 UTC m=+145.817565268" watchObservedRunningTime="2025-12-04 12:15:42.776895906 +0000 UTC m=+145.818342473" Dec 04 12:15:42 crc kubenswrapper[4760]: I1204 12:15:42.863199 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:42 crc kubenswrapper[4760]: E1204 12:15:42.864035 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:42 crc kubenswrapper[4760]: I1204 12:15:42.863586 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:42 crc kubenswrapper[4760]: E1204 12:15:42.864116 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:42 crc kubenswrapper[4760]: I1204 12:15:42.863542 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:42 crc kubenswrapper[4760]: E1204 12:15:42.864764 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:43 crc kubenswrapper[4760]: E1204 12:15:43.117722 4760 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 12:15:44 crc kubenswrapper[4760]: I1204 12:15:44.864099 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:44 crc kubenswrapper[4760]: I1204 12:15:44.864137 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:44 crc kubenswrapper[4760]: I1204 12:15:44.864150 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:44 crc kubenswrapper[4760]: I1204 12:15:44.864266 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:44 crc kubenswrapper[4760]: E1204 12:15:44.864264 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:44 crc kubenswrapper[4760]: E1204 12:15:44.864397 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:44 crc kubenswrapper[4760]: E1204 12:15:44.864498 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:44 crc kubenswrapper[4760]: E1204 12:15:44.864614 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:46 crc kubenswrapper[4760]: I1204 12:15:46.864132 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:46 crc kubenswrapper[4760]: I1204 12:15:46.864167 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:46 crc kubenswrapper[4760]: I1204 12:15:46.864195 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:46 crc kubenswrapper[4760]: I1204 12:15:46.864143 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:46 crc kubenswrapper[4760]: E1204 12:15:46.864333 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xpngr" podUID="b4fd6a47-556a-4236-9f60-0e7996e4608a" Dec 04 12:15:46 crc kubenswrapper[4760]: E1204 12:15:46.864436 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 12:15:46 crc kubenswrapper[4760]: E1204 12:15:46.864603 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 12:15:46 crc kubenswrapper[4760]: E1204 12:15:46.864707 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 12:15:48 crc kubenswrapper[4760]: I1204 12:15:48.863323 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:48 crc kubenswrapper[4760]: I1204 12:15:48.863357 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:15:48 crc kubenswrapper[4760]: I1204 12:15:48.863489 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:48 crc kubenswrapper[4760]: I1204 12:15:48.864413 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:48 crc kubenswrapper[4760]: I1204 12:15:48.867120 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 12:15:48 crc kubenswrapper[4760]: I1204 12:15:48.867278 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 12:15:48 crc kubenswrapper[4760]: I1204 12:15:48.867290 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 12:15:48 crc kubenswrapper[4760]: I1204 12:15:48.867340 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 12:15:48 crc kubenswrapper[4760]: I1204 12:15:48.867466 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 12:15:48 crc kubenswrapper[4760]: I1204 12:15:48.867627 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 12:15:51 crc kubenswrapper[4760]: I1204 12:15:51.179372 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:51 crc kubenswrapper[4760]: E1204 12:15:51.179589 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:17:53.179560893 +0000 UTC m=+276.221007460 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:51 crc kubenswrapper[4760]: I1204 12:15:51.179868 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:51 crc kubenswrapper[4760]: I1204 12:15:51.179929 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:51 crc kubenswrapper[4760]: I1204 12:15:51.186484 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:51 crc kubenswrapper[4760]: I1204 12:15:51.281123 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:51 crc kubenswrapper[4760]: I1204 12:15:51.281327 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:51 crc kubenswrapper[4760]: I1204 12:15:51.284349 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:51 crc kubenswrapper[4760]: I1204 12:15:51.284882 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:51 crc 
kubenswrapper[4760]: I1204 12:15:51.289486 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 12:15:51 crc kubenswrapper[4760]: I1204 12:15:51.578686 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:15:51 crc kubenswrapper[4760]: I1204 12:15:51.931322 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 12:15:52 crc kubenswrapper[4760]: W1204 12:15:52.155439 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-7a5e4f242d941c05f718d9e77da0059301cd609aff83f91fc2bb2c9f5510a844 WatchSource:0}: Error finding container 7a5e4f242d941c05f718d9e77da0059301cd609aff83f91fc2bb2c9f5510a844: Status 404 returned error can't find the container with id 7a5e4f242d941c05f718d9e77da0059301cd609aff83f91fc2bb2c9f5510a844 Dec 04 12:15:52 crc kubenswrapper[4760]: W1204 12:15:52.171137 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-b000af9b53d4a778ee310954eea5cb8dcbac87dc5b028be2f18f3939f460e1db WatchSource:0}: Error finding container b000af9b53d4a778ee310954eea5cb8dcbac87dc5b028be2f18f3939f460e1db: Status 404 returned error can't find the container with id b000af9b53d4a778ee310954eea5cb8dcbac87dc5b028be2f18f3939f460e1db Dec 04 12:15:52 crc kubenswrapper[4760]: I1204 12:15:52.204174 4760 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 12:15:52 crc kubenswrapper[4760]: W1204 12:15:52.404410 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-9e04603ad382870a53cb205dea2ca7be38cf2e74c24d4c41bccd0c9eeb13fc7e WatchSource:0}: Error finding container 9e04603ad382870a53cb205dea2ca7be38cf2e74c24d4c41bccd0c9eeb13fc7e: Status 404 returned error can't find the container with id 9e04603ad382870a53cb205dea2ca7be38cf2e74c24d4c41bccd0c9eeb13fc7e
Dec 04 12:15:52 crc kubenswrapper[4760]: I1204 12:15:52.784849 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"874b7071593951f31ba2dbf6f387029712ff713e0a6ff8d101e96641f68b61b8"}
Dec 04 12:15:52 crc kubenswrapper[4760]: I1204 12:15:52.784912 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9e04603ad382870a53cb205dea2ca7be38cf2e74c24d4c41bccd0c9eeb13fc7e"}
Dec 04 12:15:52 crc kubenswrapper[4760]: I1204 12:15:52.786823 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"55db700062d52e6716d6764c9bc955a8a8e74d3c8c871955579a0d0404aa650d"}
Dec 04 12:15:52 crc kubenswrapper[4760]: I1204 12:15:52.786870 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b000af9b53d4a778ee310954eea5cb8dcbac87dc5b028be2f18f3939f460e1db"}
Dec 04 12:15:52 crc kubenswrapper[4760]: I1204 12:15:52.788736 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dbe44072ef5bdb7d4506aee9e16057c03637e11720903277fca8c14da50ba3ca"}
Dec 04 12:15:52 crc kubenswrapper[4760]: I1204 12:15:52.788780 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7a5e4f242d941c05f718d9e77da0059301cd609aff83f91fc2bb2c9f5510a844"}
Dec 04 12:15:52 crc kubenswrapper[4760]: I1204 12:15:52.788943 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.931027 4760 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.986045 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jz7p7"]
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.986926 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.989250 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ghltq"]
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.989832 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"]
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.990329 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.990373 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.991858 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr"]
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.992345 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.992413 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.992571 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.992618 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.992616 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.993040 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"]
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.993788 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.994138 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.994244 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.994339 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.994412 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.995098 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.996853 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cn9kc"]
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.997422 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.997680 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 04 12:15:53 crc kubenswrapper[4760]: I1204 12:15:53.999703 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7fpdd"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.000134 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.001787 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.001916 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.002030 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.002295 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.006492 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nc52r"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.007560 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nc52r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008607 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008625 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jkc\" (UniqueName: \"kubernetes.io/projected/a6b7fabb-b00b-41d3-9a63-291959a7c157-kube-api-access-q6jkc\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008654 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2914d4c6-5bae-4d02-aaf6-13556172e946-serving-cert\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008673 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2914d4c6-5bae-4d02-aaf6-13556172e946-node-pullsecrets\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008697 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e24d96-40d2-40c8-83cf-cf9bcddb570a-serving-cert\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008713 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-config\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008730 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-config\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008748 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2914d4c6-5bae-4d02-aaf6-13556172e946-encryption-config\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008763 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-machine-approver-tls\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008776 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-trusted-ca\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008791 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-audit\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008804 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-audit-policies\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008818 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-config\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008832 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-config\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008847 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-image-import-ca\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008860 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008874 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-audit-dir\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008888 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-config\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008903 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-client-ca\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008920 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2914d4c6-5bae-4d02-aaf6-13556172e946-audit-dir\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008944 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6b7fabb-b00b-41d3-9a63-291959a7c157-config\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008958 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008976 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rl8\" (UniqueName: \"kubernetes.io/projected/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-kube-api-access-44rl8\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.008991 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009005 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h92pb\" (UniqueName: \"kubernetes.io/projected/2914d4c6-5bae-4d02-aaf6-13556172e946-kube-api-access-h92pb\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009020 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2914d4c6-5bae-4d02-aaf6-13556172e946-etcd-client\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009037 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009059 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-etcd-serving-ca\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009075 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-service-ca-bundle\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009090 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjgk7\" (UniqueName: \"kubernetes.io/projected/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-kube-api-access-qjgk7\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009105 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-config\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009119 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-serving-cert\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009137 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009145 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5rkg\" (UniqueName: \"kubernetes.io/projected/46e24d96-40d2-40c8-83cf-cf9bcddb570a-kube-api-access-v5rkg\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009447 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fft7w\" (UniqueName: \"kubernetes.io/projected/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-kube-api-access-fft7w\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009477 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009491 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-encryption-config\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009508 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6b7fabb-b00b-41d3-9a63-291959a7c157-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009521 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-auth-proxy-config\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009534 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-etcd-client\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009548 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-serving-cert\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009563 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-serving-cert\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009576 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp2mn\" (UniqueName: \"kubernetes.io/projected/60c1d932-093d-416b-9c58-88ff3d559656-kube-api-access-cp2mn\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009591 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89ft8\" (UniqueName: \"kubernetes.io/projected/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-kube-api-access-89ft8\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009606 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6b7fabb-b00b-41d3-9a63-291959a7c157-images\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009620 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-client-ca\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009635 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c1d932-093d-416b-9c58-88ff3d559656-serving-cert\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.009785 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bmjcx"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.010786 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.011284 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2qtg7"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.011612 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.012418 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.012887 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.013563 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.013627 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.013663 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-5lxbp"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.013695 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.013776 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.013539 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.013938 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.013636 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.013576 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.014249 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.014287 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5lxbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.015541 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.015652 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.015744 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.015914 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.015945 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.017136 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.018338 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.018675 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.018896 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.019110 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.029272 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.031375 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.031955 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.032304 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.032618 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.032806 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.033133 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.033851 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.034072 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.037808 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.038273 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.038637 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.038929 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.039185 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.046106 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.061561 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.061651 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.065815 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dn2jn"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.065990 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.066264 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.066449 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.066556 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.066628 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.066670 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.066746 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.066845 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.066936 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067027 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067040 4760 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067064 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067127 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067232 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067293 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067333 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067382 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067468 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067482 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067548 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067592 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-dn2jn" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067631 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067677 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067762 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067868 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.067960 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068119 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068135 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068270 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068394 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068499 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068526 4760 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068537 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068537 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c7hmd"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068590 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068654 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068888 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.068893 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.071464 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.072361 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.072696 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.072826 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.073337 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.073997 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.074299 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.074814 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cwwhx"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.075478 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.076011 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.080494 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.084082 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.084822 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.085115 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.088052 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.089484 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.092551 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.093565 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.101370 4760 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.101921 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.102352 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.103074 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.103757 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.106706 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.106759 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.109659 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.110087 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.128084 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.128713 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.128876 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-config\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.129399 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.129462 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.129531 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-config\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.129890 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.129938 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-metrics-tls\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.129978 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdgd6\" (UniqueName: \"kubernetes.io/projected/94f4372f-7c56-4cf5-9384-ae555845f15a-kube-api-access-pdgd6\") pod \"openshift-apiserver-operator-796bbdcf4f-lf78k\" (UID: \"94f4372f-7c56-4cf5-9384-ae555845f15a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.130055 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl5qf\" (UniqueName: \"kubernetes.io/projected/e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f-kube-api-access-cl5qf\") pod \"openshift-config-operator-7777fb866f-pqn2r\" (UID: \"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.130376 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-serving-cert\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.130419 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2914d4c6-5bae-4d02-aaf6-13556172e946-encryption-config\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.130451 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-machine-approver-tls\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.130761 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-trusted-ca\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.130800 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-audit\") pod 
\"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.130831 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-audit-policies\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.130860 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-config\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131157 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-config\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131173 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-config\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131317 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-config\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131350 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n9qz\" (UniqueName: \"kubernetes.io/projected/760a4690-8674-4847-af35-49eec4903ef2-kube-api-access-5n9qz\") pod \"dns-operator-744455d44c-c7hmd\" (UID: \"760a4690-8674-4847-af35-49eec4903ef2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131377 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131675 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj2t4\" (UniqueName: \"kubernetes.io/projected/397d3069-2845-40f6-bbb9-d2541d0f3f80-kube-api-access-qj2t4\") pod \"downloads-7954f5f757-dn2jn\" (UID: \"397d3069-2845-40f6-bbb9-d2541d0f3f80\") " pod="openshift-console/downloads-7954f5f757-dn2jn" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131712 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-image-import-ca\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc 
kubenswrapper[4760]: I1204 12:15:54.131749 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131775 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-audit-dir\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131804 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-config\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131841 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpgvp\" (UniqueName: \"kubernetes.io/projected/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-kube-api-access-fpgvp\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.131875 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-client-ca\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132023 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2914d4c6-5bae-4d02-aaf6-13556172e946-audit-dir\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132053 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132080 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-etcd-client\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132443 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6b7fabb-b00b-41d3-9a63-291959a7c157-config\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132500 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/65668b2d-24cd-4a5f-8b00-ed3778e134fd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8j4hs\" (UID: \"65668b2d-24cd-4a5f-8b00-ed3778e134fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132535 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132770 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44rl8\" (UniqueName: \"kubernetes.io/projected/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-kube-api-access-44rl8\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132832 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132881 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-dir\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 
04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132904 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-trusted-ca\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132933 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88hkx\" (UniqueName: \"kubernetes.io/projected/af58c416-6966-498a-9e34-9cd879d3a21c-kube-api-access-88hkx\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ncpt\" (UID: \"af58c416-6966-498a-9e34-9cd879d3a21c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132965 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-oauth-serving-cert\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.132994 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94f4372f-7c56-4cf5-9384-ae555845f15a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lf78k\" (UID: \"94f4372f-7c56-4cf5-9384-ae555845f15a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133021 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133547 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65668b2d-24cd-4a5f-8b00-ed3778e134fd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8j4hs\" (UID: \"65668b2d-24cd-4a5f-8b00-ed3778e134fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133579 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133609 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h92pb\" (UniqueName: \"kubernetes.io/projected/2914d4c6-5bae-4d02-aaf6-13556172e946-kube-api-access-h92pb\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133645 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2914d4c6-5bae-4d02-aaf6-13556172e946-etcd-client\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 
crc kubenswrapper[4760]: I1204 12:15:54.133665 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133702 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-config\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133734 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpscn\" (UniqueName: \"kubernetes.io/projected/87218323-b321-4cd8-8da1-5fa8769eb3b0-kube-api-access-bpscn\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133865 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-pqn2r\" (UID: \"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133895 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133920 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-service-ca-bundle\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133960 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-etcd-service-ca\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.133997 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-config\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134027 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65668b2d-24cd-4a5f-8b00-ed3778e134fd-config\") pod \"kube-controller-manager-operator-78b949d7b-8j4hs\" (UID: \"65668b2d-24cd-4a5f-8b00-ed3778e134fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134055 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76f89a4f-bf24-4abe-a62b-af295dc6f208-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134164 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjgk7\" (UniqueName: \"kubernetes.io/projected/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-kube-api-access-qjgk7\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134220 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-serving-cert\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134245 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134265 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af58c416-6966-498a-9e34-9cd879d3a21c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ncpt\" 
(UID: \"af58c416-6966-498a-9e34-9cd879d3a21c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134365 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/760a4690-8674-4847-af35-49eec4903ef2-metrics-tls\") pod \"dns-operator-744455d44c-c7hmd\" (UID: \"760a4690-8674-4847-af35-49eec4903ef2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134390 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94f4372f-7c56-4cf5-9384-ae555845f15a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lf78k\" (UID: \"94f4372f-7c56-4cf5-9384-ae555845f15a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134419 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5rkg\" (UniqueName: \"kubernetes.io/projected/46e24d96-40d2-40c8-83cf-cf9bcddb570a-kube-api-access-v5rkg\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134494 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fft7w\" (UniqueName: \"kubernetes.io/projected/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-kube-api-access-fft7w\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134524 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134565 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134599 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-trusted-ca-bundle\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134744 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w6dt\" (UniqueName: \"kubernetes.io/projected/78d9eb51-de42-4536-a561-aee39bfd92f3-kube-api-access-8w6dt\") pod \"cluster-samples-operator-665b6dd947-lms2q\" (UID: \"78d9eb51-de42-4536-a561-aee39bfd92f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134770 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f75j\" (UniqueName: \"kubernetes.io/projected/55d5b8e1-f975-4db6-a2e5-4d5f40dff81c-kube-api-access-7f75j\") 
pod \"migrator-59844c95c7-nwh78\" (UID: \"55d5b8e1-f975-4db6-a2e5-4d5f40dff81c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134828 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d9eb51-de42-4536-a561-aee39bfd92f3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-lms2q\" (UID: \"78d9eb51-de42-4536-a561-aee39bfd92f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134852 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134870 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-encryption-config\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134891 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-oauth-config\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134916 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f-serving-cert\") pod \"openshift-config-operator-7777fb866f-pqn2r\" (UID: \"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134937 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-etcd-ca\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134963 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6b7fabb-b00b-41d3-9a63-291959a7c157-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.134987 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-auth-proxy-config\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135025 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-etcd-client\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:15:54 crc 
kubenswrapper[4760]: I1204 12:15:54.135048 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-serving-cert\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135074 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/76f89a4f-bf24-4abe-a62b-af295dc6f208-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135125 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-policies\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135148 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af58c416-6966-498a-9e34-9cd879d3a21c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ncpt\" (UID: \"af58c416-6966-498a-9e34-9cd879d3a21c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135201 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp2mn\" (UniqueName: 
\"kubernetes.io/projected/60c1d932-093d-416b-9c58-88ff3d559656-kube-api-access-cp2mn\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135253 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76f89a4f-bf24-4abe-a62b-af295dc6f208-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135276 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-serving-cert\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135303 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89ft8\" (UniqueName: \"kubernetes.io/projected/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-kube-api-access-89ft8\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135462 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-client-ca\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 
12:15:54.135536 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c1d932-093d-416b-9c58-88ff3d559656-serving-cert\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135575 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6b7fabb-b00b-41d3-9a63-291959a7c157-images\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135625 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm9db\" (UniqueName: \"kubernetes.io/projected/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-kube-api-access-hm9db\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135671 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-serving-cert\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135723 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwcsq\" (UniqueName: \"kubernetes.io/projected/8c791b99-3020-4b20-9d91-87c5ba9f615a-kube-api-access-gwcsq\") pod \"console-f9d7485db-5lxbp\" (UID: 
\"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135828 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135874 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jkc\" (UniqueName: \"kubernetes.io/projected/a6b7fabb-b00b-41d3-9a63-291959a7c157-kube-api-access-q6jkc\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135937 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2914d4c6-5bae-4d02-aaf6-13556172e946-serving-cert\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.135970 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-service-ca\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.136006 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkjmp\" (UniqueName: 
\"kubernetes.io/projected/76f89a4f-bf24-4abe-a62b-af295dc6f208-kube-api-access-vkjmp\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.136059 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.136101 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2914d4c6-5bae-4d02-aaf6-13556172e946-node-pullsecrets\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.136145 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e24d96-40d2-40c8-83cf-cf9bcddb570a-serving-cert\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.142535 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-config\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc 
kubenswrapper[4760]: I1204 12:15:54.154578 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.161138 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-42rhb"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.161519 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.161835 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.161947 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2914d4c6-5bae-4d02-aaf6-13556172e946-encryption-config\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.162008 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.162536 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.162735 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-image-import-ca\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.162852 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.163108 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-client-ca\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.163797 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.164023 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-config\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.164847 
4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-config\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.165305 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-config\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.165691 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-client-ca\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.166171 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-service-ca-bundle\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.167817 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-etcd-serving-ca\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 
12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.168789 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-config\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.172523 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-42rhb"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.173961 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-audit-policies\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.174094 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.174274 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2914d4c6-5bae-4d02-aaf6-13556172e946-audit-dir\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.174424 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-audit-dir\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.174567 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2914d4c6-5bae-4d02-aaf6-13556172e946-node-pullsecrets\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.174658 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-auth-proxy-config\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.174979 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.177260 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-serving-cert\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.179384 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.180129 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nw7pr"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.180148 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-serving-cert\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.180648 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.181018 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a6b7fabb-b00b-41d3-9a63-291959a7c157-images\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.181141 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.182771 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-audit\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.183447 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e24d96-40d2-40c8-83cf-cf9bcddb570a-serving-cert\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.185399 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.186634 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2914d4c6-5bae-4d02-aaf6-13556172e946-trusted-ca-bundle\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.189001 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.189592 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-trusted-ca\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.191039 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-etcd-client\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.191343 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-serving-cert\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.192365 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.192545 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.193408 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.194473 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.195588 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.195758 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6b7fabb-b00b-41d3-9a63-291959a7c157-config\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.195820 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.196323 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.210912 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c1d932-093d-416b-9c58-88ff3d559656-serving-cert\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.212963 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.213870 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.214926 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.216050 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.219030 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.219750 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-machine-approver-tls\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.219933 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.220092 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.221555 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6b7fabb-b00b-41d3-9a63-291959a7c157-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.221597 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2914d4c6-5bae-4d02-aaf6-13556172e946-etcd-client\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.222186 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2914d4c6-5bae-4d02-aaf6-13556172e946-serving-cert\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.222619 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.223401 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.226677 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvdzl"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.227503 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.232304 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.232618 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.233491 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.235632 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zpxt2"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.236558 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.237067 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.237512 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.238731 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-encryption-config\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248422 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248475 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-etcd-client\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248518 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65668b2d-24cd-4a5f-8b00-ed3778e134fd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8j4hs\" (UID: \"65668b2d-24cd-4a5f-8b00-ed3778e134fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248549 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-dir\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248568 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-trusted-ca\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248590 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88hkx\" (UniqueName: \"kubernetes.io/projected/af58c416-6966-498a-9e34-9cd879d3a21c-kube-api-access-88hkx\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ncpt\" (UID: \"af58c416-6966-498a-9e34-9cd879d3a21c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248613 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-oauth-serving-cert\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248643 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248660 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94f4372f-7c56-4cf5-9384-ae555845f15a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lf78k\" (UID: \"94f4372f-7c56-4cf5-9384-ae555845f15a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248683 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65668b2d-24cd-4a5f-8b00-ed3778e134fd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8j4hs\" (UID: \"65668b2d-24cd-4a5f-8b00-ed3778e134fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248702 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248751 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-config\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248780 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpscn\" (UniqueName: \"kubernetes.io/projected/87218323-b321-4cd8-8da1-5fa8769eb3b0-kube-api-access-bpscn\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248809 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-pqn2r\" (UID: \"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248830 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-etcd-service-ca\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248862 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65668b2d-24cd-4a5f-8b00-ed3778e134fd-config\") pod \"kube-controller-manager-operator-78b949d7b-8j4hs\" (UID: \"65668b2d-24cd-4a5f-8b00-ed3778e134fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248881 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76f89a4f-bf24-4abe-a62b-af295dc6f208-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248903 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248923 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af58c416-6966-498a-9e34-9cd879d3a21c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ncpt\" (UID: \"af58c416-6966-498a-9e34-9cd879d3a21c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248942 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/760a4690-8674-4847-af35-49eec4903ef2-metrics-tls\") pod \"dns-operator-744455d44c-c7hmd\" (UID: \"760a4690-8674-4847-af35-49eec4903ef2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.248960 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94f4372f-7c56-4cf5-9384-ae555845f15a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lf78k\" (UID: \"94f4372f-7c56-4cf5-9384-ae555845f15a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249012 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249030 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249048 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-trusted-ca-bundle\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249070 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f75j\" (UniqueName: \"kubernetes.io/projected/55d5b8e1-f975-4db6-a2e5-4d5f40dff81c-kube-api-access-7f75j\") pod \"migrator-59844c95c7-nwh78\" (UID: \"55d5b8e1-f975-4db6-a2e5-4d5f40dff81c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249123 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d9eb51-de42-4536-a561-aee39bfd92f3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-lms2q\" (UID: \"78d9eb51-de42-4536-a561-aee39bfd92f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249145 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w6dt\" (UniqueName: \"kubernetes.io/projected/78d9eb51-de42-4536-a561-aee39bfd92f3-kube-api-access-8w6dt\") pod \"cluster-samples-operator-665b6dd947-lms2q\" (UID: \"78d9eb51-de42-4536-a561-aee39bfd92f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249171 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-oauth-config\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249191 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f-serving-cert\") pod \"openshift-config-operator-7777fb866f-pqn2r\" (UID: \"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249223 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-etcd-ca\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249249 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-policies\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249264 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af58c416-6966-498a-9e34-9cd879d3a21c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ncpt\" (UID: \"af58c416-6966-498a-9e34-9cd879d3a21c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249287 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/76f89a4f-bf24-4abe-a62b-af295dc6f208-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249310 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76f89a4f-bf24-4abe-a62b-af295dc6f208-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249347 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249365 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm9db\" (UniqueName: \"kubernetes.io/projected/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-kube-api-access-hm9db\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249383 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-serving-cert\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249399 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwcsq\" (UniqueName: \"kubernetes.io/projected/8c791b99-3020-4b20-9d91-87c5ba9f615a-kube-api-access-gwcsq\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249430 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-service-ca\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249449 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkjmp\" (UniqueName: \"kubernetes.io/projected/76f89a4f-bf24-4abe-a62b-af295dc6f208-kube-api-access-vkjmp\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249469 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249493 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249510 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249533 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.249552 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-metrics-tls\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.251439 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdgd6\" (UniqueName: \"kubernetes.io/projected/94f4372f-7c56-4cf5-9384-ae555845f15a-kube-api-access-pdgd6\") pod \"openshift-apiserver-operator-796bbdcf4f-lf78k\" (UID: \"94f4372f-7c56-4cf5-9384-ae555845f15a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.251518 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl5qf\" (UniqueName: \"kubernetes.io/projected/e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f-kube-api-access-cl5qf\") pod \"openshift-config-operator-7777fb866f-pqn2r\" (UID: \"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.251596 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-serving-cert\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.251778 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-config\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.251828 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-trusted-ca-bundle\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.251880 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-policies\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.252117 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n9qz\" (UniqueName: \"kubernetes.io/projected/760a4690-8674-4847-af35-49eec4903ef2-kube-api-access-5n9qz\") pod \"dns-operator-744455d44c-c7hmd\" (UID: \"760a4690-8674-4847-af35-49eec4903ef2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.252634 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-service-ca\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.253600 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-dir\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.255178 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.255310 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj2t4\" (UniqueName: \"kubernetes.io/projected/397d3069-2845-40f6-bbb9-d2541d0f3f80-kube-api-access-qj2t4\") pod \"downloads-7954f5f757-dn2jn\" (UID: \"397d3069-2845-40f6-bbb9-d2541d0f3f80\") " pod="openshift-console/downloads-7954f5f757-dn2jn"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.255380 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpgvp\" (UniqueName: \"kubernetes.io/projected/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-kube-api-access-fpgvp\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.255601 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.256239 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"]
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.257991 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94f4372f-7c56-4cf5-9384-ae555845f15a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lf78k\" (UID: \"94f4372f-7c56-4cf5-9384-ae555845f15a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k"
Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.258978 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-pqn2r\" (UID: \"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.259412 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.259840 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-etcd-ca\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.260430 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.260449 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.261121 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.261441 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.261530 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-etcd-client\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.262658 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.262830 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.263314 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.264079 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f-serving-cert\") pod \"openshift-config-operator-7777fb866f-pqn2r\" (UID: \"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.264516 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-config\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.264549 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-oauth-serving-cert\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.264798 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ghltq"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.264965 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.266140 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/76f89a4f-bf24-4abe-a62b-af295dc6f208-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.266851 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.268007 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-config\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.268847 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jlmjr"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.269688 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-etcd-service-ca\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 
12:15:54.269897 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-serving-cert\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.270513 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jlmjr" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.270766 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.271135 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-oauth-config\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.271643 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d9eb51-de42-4536-a561-aee39bfd92f3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-lms2q\" (UID: \"78d9eb51-de42-4536-a561-aee39bfd92f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.271790 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/760a4690-8674-4847-af35-49eec4903ef2-metrics-tls\") pod \"dns-operator-744455d44c-c7hmd\" (UID: \"760a4690-8674-4847-af35-49eec4903ef2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.272636 
4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.272735 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.273754 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jz7p7"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.274827 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.274845 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.275713 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-serving-cert\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.275911 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.276719 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94f4372f-7c56-4cf5-9384-ae555845f15a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lf78k\" (UID: \"94f4372f-7c56-4cf5-9384-ae555845f15a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.276815 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nc52r"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.278101 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.290642 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.291895 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.292693 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/76f89a4f-bf24-4abe-a62b-af295dc6f208-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.294411 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7fpdd"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.298233 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bmjcx"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.304265 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cn9kc"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.306570 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.307731 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.309488 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.312625 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.313873 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cwwhx"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.318518 4760 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console/console-f9d7485db-5lxbp"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.320114 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.321185 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c7hmd"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.322665 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2qtg7"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.323887 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nw7pr"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.325340 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.326488 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.327632 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.328929 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.330238 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-58wzk"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.333072 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt"] Dec 04 12:15:54 crc 
kubenswrapper[4760]: I1204 12:15:54.333198 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.333710 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.335270 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.336361 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvdzl"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.337575 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.338759 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.339803 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mwp4x"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.340537 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.340965 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.342171 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.343169 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dn2jn"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.344319 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.345376 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.346669 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-58wzk"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.347876 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.349161 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zpxt2"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.349955 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.351858 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-jlmjr"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.352013 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.354134 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-f9rr2"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.355426 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.355438 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f9rr2"] Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.369777 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.380757 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65668b2d-24cd-4a5f-8b00-ed3778e134fd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8j4hs\" (UID: \"65668b2d-24cd-4a5f-8b00-ed3778e134fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.389789 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.409909 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.411963 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/65668b2d-24cd-4a5f-8b00-ed3778e134fd-config\") pod \"kube-controller-manager-operator-78b949d7b-8j4hs\" (UID: \"65668b2d-24cd-4a5f-8b00-ed3778e134fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.430431 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.450351 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.471144 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.478510 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af58c416-6966-498a-9e34-9cd879d3a21c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ncpt\" (UID: \"af58c416-6966-498a-9e34-9cd879d3a21c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.489717 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.500505 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af58c416-6966-498a-9e34-9cd879d3a21c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ncpt\" (UID: \"af58c416-6966-498a-9e34-9cd879d3a21c\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.509716 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.530341 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.551353 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.569497 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.593328 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.614410 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.629277 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.640323 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-metrics-tls\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.660847 4760 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.670536 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.671111 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-trusted-ca\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.724782 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5rkg\" (UniqueName: \"kubernetes.io/projected/46e24d96-40d2-40c8-83cf-cf9bcddb570a-kube-api-access-v5rkg\") pod \"route-controller-manager-6576b87f9c-xxt2f\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.747148 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rl8\" (UniqueName: \"kubernetes.io/projected/7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15-kube-api-access-44rl8\") pod \"authentication-operator-69f744f599-cn9kc\" (UID: \"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.765071 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fft7w\" (UniqueName: \"kubernetes.io/projected/33a1d2dc-800d-4409-a3a7-927f1dd0cfd5-kube-api-access-fft7w\") pod \"apiserver-7bbb656c7d-2qbls\" (UID: \"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.769740 4760 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.790196 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.809611 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.832650 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.836535 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.871088 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89ft8\" (UniqueName: \"kubernetes.io/projected/6c5f1da0-75e3-422b-b9e2-06f0bc29c720-kube-api-access-89ft8\") pod \"console-operator-58897d9998-nc52r\" (UID: \"6c5f1da0-75e3-422b-b9e2-06f0bc29c720\") " pod="openshift-console-operator/console-operator-58897d9998-nc52r" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.887003 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp2mn\" (UniqueName: \"kubernetes.io/projected/60c1d932-093d-416b-9c58-88ff3d559656-kube-api-access-cp2mn\") pod \"controller-manager-879f6c89f-7fpdd\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.908365 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h92pb\" (UniqueName: 
\"kubernetes.io/projected/2914d4c6-5bae-4d02-aaf6-13556172e946-kube-api-access-h92pb\") pod \"apiserver-76f77b778f-jz7p7\" (UID: \"2914d4c6-5bae-4d02-aaf6-13556172e946\") " pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.914545 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.926341 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjgk7\" (UniqueName: \"kubernetes.io/projected/d0f3af3b-1349-4450-8df4-9b3cd05e6d26-kube-api-access-qjgk7\") pod \"machine-approver-56656f9798-fdnjr\" (UID: \"d0f3af3b-1349-4450-8df4-9b3cd05e6d26\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.938801 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.967925 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jkc\" (UniqueName: \"kubernetes.io/projected/a6b7fabb-b00b-41d3-9a63-291959a7c157-kube-api-access-q6jkc\") pod \"machine-api-operator-5694c8668f-ghltq\" (UID: \"a6b7fabb-b00b-41d3-9a63-291959a7c157\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.970052 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.976621 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:15:54 crc kubenswrapper[4760]: I1204 12:15:54.990196 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.009720 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.017303 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.030661 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.046504 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.051164 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.075986 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.099700 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.110591 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.130522 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 12:15:55 crc 
kubenswrapper[4760]: I1204 12:15:55.152624 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.154471 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.170364 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.175518 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nc52r" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.187590 4760 request.go:700] Waited for 1.003886029s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-metrics-certs-default&limit=500&resourceVersion=0 Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.189380 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.209448 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.249334 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.251724 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.266501 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.269920 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.288493 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cn9kc"] Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.291190 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.310687 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.330766 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.349898 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.375689 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.389906 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.394178 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"] Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.411094 4760 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.429841 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.442966 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-jz7p7"] Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.450220 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.473240 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.486539 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls"] Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.496842 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 12:15:55 crc kubenswrapper[4760]: W1204 12:15:55.513849 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33a1d2dc_800d_4409_a3a7_927f1dd0cfd5.slice/crio-34a444d238c7d4800752a8130963d379a81f4b06ef9adb6b6720c2df54da5519 WatchSource:0}: Error finding container 34a444d238c7d4800752a8130963d379a81f4b06ef9adb6b6720c2df54da5519: Status 404 returned error can't find the container with id 34a444d238c7d4800752a8130963d379a81f4b06ef9adb6b6720c2df54da5519 Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.513982 4760 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.525529 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nc52r"] Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.530353 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.550395 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.569613 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.596936 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.608964 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.630426 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.631776 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ghltq"] Dec 04 12:15:55 crc kubenswrapper[4760]: W1204 12:15:55.641631 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6b7fabb_b00b_41d3_9a63_291959a7c157.slice/crio-382890d83d9c88b6fcf222e7e4654c7d2f36944e05fa4555f633f7c5cc290b74 WatchSource:0}: Error finding container 
382890d83d9c88b6fcf222e7e4654c7d2f36944e05fa4555f633f7c5cc290b74: Status 404 returned error can't find the container with id 382890d83d9c88b6fcf222e7e4654c7d2f36944e05fa4555f633f7c5cc290b74 Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.650501 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.665074 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7fpdd"] Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.671716 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 12:15:55 crc kubenswrapper[4760]: W1204 12:15:55.680786 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60c1d932_093d_416b_9c58_88ff3d559656.slice/crio-38b37d742ff3aa4166c846a73485934a79500ffe45508c05cb7f3359131bd850 WatchSource:0}: Error finding container 38b37d742ff3aa4166c846a73485934a79500ffe45508c05cb7f3359131bd850: Status 404 returned error can't find the container with id 38b37d742ff3aa4166c846a73485934a79500ffe45508c05cb7f3359131bd850 Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.689418 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.708981 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.731525 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 
12:15:55.749612 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.770025 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.819867 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/76f89a4f-bf24-4abe-a62b-af295dc6f208-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.831903 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f75j\" (UniqueName: \"kubernetes.io/projected/55d5b8e1-f975-4db6-a2e5-4d5f40dff81c-kube-api-access-7f75j\") pod \"migrator-59844c95c7-nwh78\" (UID: \"55d5b8e1-f975-4db6-a2e5-4d5f40dff81c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.836502 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" event={"ID":"d0f3af3b-1349-4450-8df4-9b3cd05e6d26","Type":"ContainerStarted","Data":"da80bd67d4215a3858ba9b5b562501f25fdfba0d0e2f5b67d6cc00f10472ac04"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.836564 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" event={"ID":"d0f3af3b-1349-4450-8df4-9b3cd05e6d26","Type":"ContainerStarted","Data":"7602930bd56f53f6f7bc6241e628e06667e8690aa5dda7fc2d26b7e0705f4aaf"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.837840 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" event={"ID":"60c1d932-093d-416b-9c58-88ff3d559656","Type":"ContainerStarted","Data":"38b37d742ff3aa4166c846a73485934a79500ffe45508c05cb7f3359131bd850"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.842099 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" event={"ID":"46e24d96-40d2-40c8-83cf-cf9bcddb570a","Type":"ContainerStarted","Data":"bdb449ea01963bcdd046864279afc24ae036c935c568c8e58ee02d511917100e"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.842136 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" event={"ID":"46e24d96-40d2-40c8-83cf-cf9bcddb570a","Type":"ContainerStarted","Data":"08068c352e339f2942f62ca1301c82ff354e74ffd54fa977cd8b357e07d5b62d"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.842333 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.843579 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" event={"ID":"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15","Type":"ContainerStarted","Data":"221e9ad91c25a7aa9d5d8599957f58cbc61250283b4e2577d5ba5eb476a7a236"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.843616 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" event={"ID":"7c3fce21-b4c5-4f8a-b648-7fa2a2d45f15","Type":"ContainerStarted","Data":"fa590ab58769d6b9fa2646f32eedc2070445384fcb5ae86392d161b0c9cac7e3"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.844224 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vkjmp\" (UniqueName: \"kubernetes.io/projected/76f89a4f-bf24-4abe-a62b-af295dc6f208-kube-api-access-vkjmp\") pod \"cluster-image-registry-operator-dc59b4c8b-68c2j\" (UID: \"76f89a4f-bf24-4abe-a62b-af295dc6f208\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.844586 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.847090 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq" event={"ID":"a6b7fabb-b00b-41d3-9a63-291959a7c157","Type":"ContainerStarted","Data":"95ebbe6960ce94fb419a18ec4e56b20396c22c2620b92c3044c958c085e32e32"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.847136 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq" event={"ID":"a6b7fabb-b00b-41d3-9a63-291959a7c157","Type":"ContainerStarted","Data":"382890d83d9c88b6fcf222e7e4654c7d2f36944e05fa4555f633f7c5cc290b74"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.853703 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" event={"ID":"2914d4c6-5bae-4d02-aaf6-13556172e946","Type":"ContainerStarted","Data":"80f518f90dbfb72fb873162c0604cc050a08c294fc0d914915c9306b844eaedd"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.858687 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nc52r" event={"ID":"6c5f1da0-75e3-422b-b9e2-06f0bc29c720","Type":"ContainerStarted","Data":"f067858a6f4cb7245287cd9d057fdac45c0f9609dc14e945e4a42eadcfebbdec"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.858732 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-nc52r" event={"ID":"6c5f1da0-75e3-422b-b9e2-06f0bc29c720","Type":"ContainerStarted","Data":"6783d8c5acfeac081b56cd61a9a5950f156d1354bba17e6af315e5db06f66c9a"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.863559 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" event={"ID":"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5","Type":"ContainerStarted","Data":"34a444d238c7d4800752a8130963d379a81f4b06ef9adb6b6720c2df54da5519"} Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.881010 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpgvp\" (UniqueName: \"kubernetes.io/projected/eceebe1d-88a4-4b8c-819d-75d63c04aeb4-kube-api-access-fpgvp\") pod \"etcd-operator-b45778765-2qtg7\" (UID: \"eceebe1d-88a4-4b8c-819d-75d63c04aeb4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.887494 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwcsq\" (UniqueName: \"kubernetes.io/projected/8c791b99-3020-4b20-9d91-87c5ba9f615a-kube-api-access-gwcsq\") pod \"console-f9d7485db-5lxbp\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.899284 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.924577 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl5qf\" (UniqueName: \"kubernetes.io/projected/e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f-kube-api-access-cl5qf\") pod \"openshift-config-operator-7777fb866f-pqn2r\" (UID: \"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.927792 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpscn\" (UniqueName: \"kubernetes.io/projected/87218323-b321-4cd8-8da1-5fa8769eb3b0-kube-api-access-bpscn\") pod \"oauth-openshift-558db77b4-bmjcx\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.967525 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88hkx\" (UniqueName: \"kubernetes.io/projected/af58c416-6966-498a-9e34-9cd879d3a21c-kube-api-access-88hkx\") pod \"openshift-controller-manager-operator-756b6f6bc6-2ncpt\" (UID: \"af58c416-6966-498a-9e34-9cd879d3a21c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.980876 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" Dec 04 12:15:55 crc kubenswrapper[4760]: I1204 12:15:55.984859 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj2t4\" 
(UniqueName: \"kubernetes.io/projected/397d3069-2845-40f6-bbb9-d2541d0f3f80-kube-api-access-qj2t4\") pod \"downloads-7954f5f757-dn2jn\" (UID: \"397d3069-2845-40f6-bbb9-d2541d0f3f80\") " pod="openshift-console/downloads-7954f5f757-dn2jn" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.008893 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65668b2d-24cd-4a5f-8b00-ed3778e134fd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8j4hs\" (UID: \"65668b2d-24cd-4a5f-8b00-ed3778e134fd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.030585 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w6dt\" (UniqueName: \"kubernetes.io/projected/78d9eb51-de42-4536-a561-aee39bfd92f3-kube-api-access-8w6dt\") pod \"cluster-samples-operator-665b6dd947-lms2q\" (UID: \"78d9eb51-de42-4536-a561-aee39bfd92f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.049897 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n9qz\" (UniqueName: \"kubernetes.io/projected/760a4690-8674-4847-af35-49eec4903ef2-kube-api-access-5n9qz\") pod \"dns-operator-744455d44c-c7hmd\" (UID: \"760a4690-8674-4847-af35-49eec4903ef2\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.072016 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm9db\" (UniqueName: \"kubernetes.io/projected/4279cf63-15b8-49a2-ab5c-794bdbf0fda8-kube-api-access-hm9db\") pod \"ingress-operator-5b745b69d9-c8jng\" (UID: \"4279cf63-15b8-49a2-ab5c-794bdbf0fda8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" Dec 04 12:15:56 crc 
kubenswrapper[4760]: I1204 12:15:56.089105 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdgd6\" (UniqueName: \"kubernetes.io/projected/94f4372f-7c56-4cf5-9384-ae555845f15a-kube-api-access-pdgd6\") pod \"openshift-apiserver-operator-796bbdcf4f-lf78k\" (UID: \"94f4372f-7c56-4cf5-9384-ae555845f15a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.133660 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.136506 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.139221 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.139330 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.140952 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.141705 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.141974 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.142563 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.148634 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.152323 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dn2jn" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.162473 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.169626 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.173630 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.177504 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.187158 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.187683 4760 request.go:700] Waited for 1.922411998s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0 Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.190371 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.190667 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.208843 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.213259 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.230853 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.250010 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.272985 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.293035 4760 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.305164 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78"] Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.452800 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.453022 4760 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.453147 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.456157 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.456598 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.456747 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.456893 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.488496 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.491558 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.502513 4760 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j"] Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.520387 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.758926 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-certificates\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.759074 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmw5j\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-kube-api-access-jmw5j\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.759126 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.759236 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-tls\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.759265 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-bound-sa-token\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.759322 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.759347 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.759376 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-trusted-ca\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: E1204 12:15:56.759979 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:57.259958874 +0000 UTC m=+160.301405621 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.861918 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:56 crc kubenswrapper[4760]: E1204 12:15:56.862464 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:57.362411065 +0000 UTC m=+160.403857642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.862571 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2262a901-d392-434b-bd32-43555b67f428-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qvdzl\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.862617 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27prx\" (UniqueName: \"kubernetes.io/projected/2623f14b-9edc-48cd-aeba-08cc1155890f-kube-api-access-27prx\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.862667 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qgf7\" (UniqueName: \"kubernetes.io/projected/55feea70-8c8d-41ad-afd1-95f0ad24faf0-kube-api-access-8qgf7\") pod \"dns-default-f9rr2\" (UID: \"55feea70-8c8d-41ad-afd1-95f0ad24faf0\") " pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.862716 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmw5j\" (UniqueName: 
\"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-kube-api-access-jmw5j\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.862807 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-plugins-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.862884 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca0dbe61-a854-4ab3-90af-4404e679cd68-metrics-certs\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.862925 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55feea70-8c8d-41ad-afd1-95f0ad24faf0-config-volume\") pod \"dns-default-f9rr2\" (UID: \"55feea70-8c8d-41ad-afd1-95f0ad24faf0\") " pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.863063 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f32fc56-11e1-419a-9b6f-1e6375b1b6a0-signing-key\") pod \"service-ca-9c57cc56f-nw7pr\" (UID: \"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.863254 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd314b5-2af3-4c1f-9ee9-406c48faaf78-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8br2p\" (UID: \"2dd314b5-2af3-4c1f-9ee9-406c48faaf78\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.863297 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0341ec2d-564b-46ec-a993-625f66f899c8-config\") pod \"service-ca-operator-777779d784-zqr7n\" (UID: \"0341ec2d-564b-46ec-a993-625f66f899c8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.863650 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwgtt\" (UniqueName: \"kubernetes.io/projected/b865046e-88e6-4c7b-b67e-9d7342619f52-kube-api-access-dwgtt\") pod \"machine-config-controller-84d6567774-6brwl\" (UID: \"b865046e-88e6-4c7b-b67e-9d7342619f52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.863704 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f95c4be-368d-4bf1-a109-ee96a5da7491-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.863730 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/89ac52b9-3fe5-4f64-b128-f88467ed56d4-tmpfs\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.863755 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jshpm\" (UniqueName: \"kubernetes.io/projected/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-kube-api-access-jshpm\") pod \"collect-profiles-29414175-9zcnx\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.864183 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da66d59c-5026-4b0f-bbda-c064af724a28-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qssb\" (UID: \"da66d59c-5026-4b0f-bbda-c064af724a28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.864524 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hshh\" (UniqueName: \"kubernetes.io/projected/1c03511a-3670-4fe4-929f-5ceaa64d0cb4-kube-api-access-2hshh\") pod \"olm-operator-6b444d44fb-8l6bv\" (UID: \"1c03511a-3670-4fe4-929f-5ceaa64d0cb4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.865383 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0dbe61-a854-4ab3-90af-4404e679cd68-service-ca-bundle\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " 
pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.865424 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fdbd7bc3-cca1-4368-814a-126ba13a4f8e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nwwhw\" (UID: \"fdbd7bc3-cca1-4368-814a-126ba13a4f8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.865768 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/96a8ab6f-5954-48fc-bc24-738807b91ea4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zpxt2\" (UID: \"96a8ab6f-5954-48fc-bc24-738807b91ea4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.865929 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f95c4be-368d-4bf1-a109-ee96a5da7491-proxy-tls\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.865993 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da66d59c-5026-4b0f-bbda-c064af724a28-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qssb\" (UID: \"da66d59c-5026-4b0f-bbda-c064af724a28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866023 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-socket-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866116 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866256 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866303 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b865046e-88e6-4c7b-b67e-9d7342619f52-proxy-tls\") pod \"machine-config-controller-84d6567774-6brwl\" (UID: \"b865046e-88e6-4c7b-b67e-9d7342619f52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866338 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1c03511a-3670-4fe4-929f-5ceaa64d0cb4-srv-cert\") pod \"olm-operator-6b444d44fb-8l6bv\" (UID: \"1c03511a-3670-4fe4-929f-5ceaa64d0cb4\") 
" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866399 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0341ec2d-564b-46ec-a993-625f66f899c8-serving-cert\") pod \"service-ca-operator-777779d784-zqr7n\" (UID: \"0341ec2d-564b-46ec-a993-625f66f899c8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866437 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80cdd4b8-9eeb-42d5-ad43-441e1037f5c4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8jnmg\" (UID: \"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866470 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80cdd4b8-9eeb-42d5-ad43-441e1037f5c4-config\") pod \"kube-apiserver-operator-766d6c64bb-8jnmg\" (UID: \"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866506 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1c03511a-3670-4fe4-929f-5ceaa64d0cb4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8l6bv\" (UID: \"1c03511a-3670-4fe4-929f-5ceaa64d0cb4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866537 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-trusted-ca\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866633 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5pjv\" (UniqueName: \"kubernetes.io/projected/2dd314b5-2af3-4c1f-9ee9-406c48faaf78-kube-api-access-r5pjv\") pod \"kube-storage-version-migrator-operator-b67b599dd-8br2p\" (UID: \"2dd314b5-2af3-4c1f-9ee9-406c48faaf78\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.866667 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ng5\" (UniqueName: \"kubernetes.io/projected/8f95c4be-368d-4bf1-a109-ee96a5da7491-kube-api-access-s4ng5\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:56 crc kubenswrapper[4760]: E1204 12:15:56.867897 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:57.367865004 +0000 UTC m=+160.409311751 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.869509 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.869540 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-certificates\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.873495 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/645c4439-d2f2-44a5-a115-566f5c436729-profile-collector-cert\") pod \"catalog-operator-68c6474976-cs9z4\" (UID: \"645c4439-d2f2-44a5-a115-566f5c436729\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.875383 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chs29\" (UniqueName: 
\"kubernetes.io/projected/b8b39e58-fac2-4827-8bee-b3e6e11c1a43-kube-api-access-chs29\") pod \"package-server-manager-789f6589d5-nqzbp\" (UID: \"b8b39e58-fac2-4827-8bee-b3e6e11c1a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.876115 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-config-volume\") pod \"collect-profiles-29414175-9zcnx\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.876420 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-certificates\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.876504 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b865046e-88e6-4c7b-b67e-9d7342619f52-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6brwl\" (UID: \"b865046e-88e6-4c7b-b67e-9d7342619f52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.876835 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvshd\" (UniqueName: \"kubernetes.io/projected/89ac52b9-3fe5-4f64-b128-f88467ed56d4-kube-api-access-qvshd\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.876920 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/55feea70-8c8d-41ad-afd1-95f0ad24faf0-metrics-tls\") pod \"dns-default-f9rr2\" (UID: \"55feea70-8c8d-41ad-afd1-95f0ad24faf0\") " pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.877070 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f32fc56-11e1-419a-9b6f-1e6375b1b6a0-signing-cabundle\") pod \"service-ca-9c57cc56f-nw7pr\" (UID: \"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.877942 4760 generic.go:334] "Generic (PLEG): container finished" podID="33a1d2dc-800d-4409-a3a7-927f1dd0cfd5" containerID="84a1d1d46776a90f9a30d57d858116590fabd2efae6820720d002b41fd112d5a" exitCode=0 Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.878108 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" event={"ID":"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5","Type":"ContainerDied","Data":"84a1d1d46776a90f9a30d57d858116590fabd2efae6820720d002b41fd112d5a"} Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.878243 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca0dbe61-a854-4ab3-90af-4404e679cd68-stats-auth\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.878411 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89ac52b9-3fe5-4f64-b128-f88467ed56d4-webhook-cert\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.878452 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-csi-data-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.878481 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhhh9\" (UniqueName: \"kubernetes.io/projected/0341ec2d-564b-46ec-a993-625f66f899c8-kube-api-access-zhhh9\") pod \"service-ca-operator-777779d784-zqr7n\" (UID: \"0341ec2d-564b-46ec-a993-625f66f899c8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.878569 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd314b5-2af3-4c1f-9ee9-406c48faaf78-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8br2p\" (UID: \"2dd314b5-2af3-4c1f-9ee9-406c48faaf78\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.878597 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-secret-volume\") pod \"collect-profiles-29414175-9zcnx\" 
(UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.878680 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-mountpoint-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.878730 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.878782 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/645c4439-d2f2-44a5-a115-566f5c436729-srv-cert\") pod \"catalog-operator-68c6474976-cs9z4\" (UID: \"645c4439-d2f2-44a5-a115-566f5c436729\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.880042 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2262a901-d392-434b-bd32-43555b67f428-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qvdzl\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.880095 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1d5570b3-b935-437b-a217-d42cd1394b53-certs\") pod \"machine-config-server-mwp4x\" (UID: \"1d5570b3-b935-437b-a217-d42cd1394b53\") " pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.882142 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1d5570b3-b935-437b-a217-d42cd1394b53-node-bootstrap-token\") pod \"machine-config-server-mwp4x\" (UID: \"1d5570b3-b935-437b-a217-d42cd1394b53\") " pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.882290 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-registration-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.882591 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8b39e58-fac2-4827-8bee-b3e6e11c1a43-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nqzbp\" (UID: \"b8b39e58-fac2-4827-8bee-b3e6e11c1a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.882676 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89ac52b9-3fe5-4f64-b128-f88467ed56d4-apiservice-cert\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.882745 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-tls\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.882822 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-bound-sa-token\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.883107 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9gls\" (UniqueName: \"kubernetes.io/projected/fdbd7bc3-cca1-4368-814a-126ba13a4f8e-kube-api-access-l9gls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nwwhw\" (UID: \"fdbd7bc3-cca1-4368-814a-126ba13a4f8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.883471 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da66d59c-5026-4b0f-bbda-c064af724a28-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qssb\" (UID: \"da66d59c-5026-4b0f-bbda-c064af724a28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.883560 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-pfzgj\" (UniqueName: \"kubernetes.io/projected/ca0dbe61-a854-4ab3-90af-4404e679cd68-kube-api-access-pfzgj\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.883606 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdz6n\" (UniqueName: \"kubernetes.io/projected/8f32fc56-11e1-419a-9b6f-1e6375b1b6a0-kube-api-access-sdz6n\") pod \"service-ca-9c57cc56f-nw7pr\" (UID: \"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.884050 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ml6t\" (UniqueName: \"kubernetes.io/projected/2262a901-d392-434b-bd32-43555b67f428-kube-api-access-6ml6t\") pod \"marketplace-operator-79b997595-qvdzl\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.884110 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b6c409a-b9ca-4b23-8b08-a9598c1cf220-cert\") pod \"ingress-canary-jlmjr\" (UID: \"5b6c409a-b9ca-4b23-8b08-a9598c1cf220\") " pod="openshift-ingress-canary/ingress-canary-jlmjr" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.884165 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pktts\" (UniqueName: \"kubernetes.io/projected/1d5570b3-b935-437b-a217-d42cd1394b53-kube-api-access-pktts\") pod \"machine-config-server-mwp4x\" (UID: \"1d5570b3-b935-437b-a217-d42cd1394b53\") " pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 
12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.884202 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rssvp\" (UniqueName: \"kubernetes.io/projected/5b6c409a-b9ca-4b23-8b08-a9598c1cf220-kube-api-access-rssvp\") pod \"ingress-canary-jlmjr\" (UID: \"5b6c409a-b9ca-4b23-8b08-a9598c1cf220\") " pod="openshift-ingress-canary/ingress-canary-jlmjr" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.884255 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca0dbe61-a854-4ab3-90af-4404e679cd68-default-certificate\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.884294 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80cdd4b8-9eeb-42d5-ad43-441e1037f5c4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8jnmg\" (UID: \"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.885762 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz26x\" (UniqueName: \"kubernetes.io/projected/96a8ab6f-5954-48fc-bc24-738807b91ea4-kube-api-access-tz26x\") pod \"multus-admission-controller-857f4d67dd-zpxt2\" (UID: \"96a8ab6f-5954-48fc-bc24-738807b91ea4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.885804 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/8f95c4be-368d-4bf1-a109-ee96a5da7491-images\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.885833 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqv5d\" (UniqueName: \"kubernetes.io/projected/645c4439-d2f2-44a5-a115-566f5c436729-kube-api-access-sqv5d\") pod \"catalog-operator-68c6474976-cs9z4\" (UID: \"645c4439-d2f2-44a5-a115-566f5c436729\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.889509 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" event={"ID":"d0f3af3b-1349-4450-8df4-9b3cd05e6d26","Type":"ContainerStarted","Data":"7ad9273d013a9ce0e57bc1ee7b193be926d3c65d6f176664e877bc8f214e8279"} Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.896588 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-trusted-ca\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.909329 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" event={"ID":"60c1d932-093d-416b-9c58-88ff3d559656","Type":"ContainerStarted","Data":"7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6"} Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.910436 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 
12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.912843 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-tls\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.912936 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78" event={"ID":"55d5b8e1-f975-4db6-a2e5-4d5f40dff81c","Type":"ContainerStarted","Data":"2904db82f05d20bccb2f536ef0fe4cdb41071d5b437ca5cfdc1bbdbbc2807c82"} Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.914530 4760 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7fpdd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.914572 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" podUID="60c1d932-093d-416b-9c58-88ff3d559656" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.914943 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmw5j\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-kube-api-access-jmw5j\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.915230 4760 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq" event={"ID":"a6b7fabb-b00b-41d3-9a63-291959a7c157","Type":"ContainerStarted","Data":"22b16cba2b054b7631373fba32905e5d8f89175d9ad00264e73ab2959ae42233"} Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.924690 4760 generic.go:334] "Generic (PLEG): container finished" podID="2914d4c6-5bae-4d02-aaf6-13556172e946" containerID="f7ec294198d47b69debe532394d41007253553baf27bc55a5a2f172191581a0f" exitCode=0 Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.924833 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" event={"ID":"2914d4c6-5bae-4d02-aaf6-13556172e946","Type":"ContainerDied","Data":"f7ec294198d47b69debe532394d41007253553baf27bc55a5a2f172191581a0f"} Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.929104 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" event={"ID":"76f89a4f-bf24-4abe-a62b-af295dc6f208","Type":"ContainerStarted","Data":"7b31e50203c28207998131ef279281c13e541a4b0c6ac42d66c7f5f1286d803c"} Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.930076 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nc52r" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.939663 4760 patch_prober.go:28] interesting pod/console-operator-58897d9998-nc52r container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.939765 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nc52r" podUID="6c5f1da0-75e3-422b-b9e2-06f0bc29c720" containerName="console-operator" 
probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.940390 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:56 crc kubenswrapper[4760]: I1204 12:15:56.946773 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-bound-sa-token\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.986884 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987069 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89ac52b9-3fe5-4f64-b128-f88467ed56d4-webhook-cert\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987093 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-csi-data-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987129 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca0dbe61-a854-4ab3-90af-4404e679cd68-stats-auth\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987166 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhhh9\" (UniqueName: \"kubernetes.io/projected/0341ec2d-564b-46ec-a993-625f66f899c8-kube-api-access-zhhh9\") pod \"service-ca-operator-777779d784-zqr7n\" (UID: \"0341ec2d-564b-46ec-a993-625f66f899c8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987189 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-secret-volume\") pod \"collect-profiles-29414175-9zcnx\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987202 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-mountpoint-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987234 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2dd314b5-2af3-4c1f-9ee9-406c48faaf78-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8br2p\" (UID: \"2dd314b5-2af3-4c1f-9ee9-406c48faaf78\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987259 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/645c4439-d2f2-44a5-a115-566f5c436729-srv-cert\") pod \"catalog-operator-68c6474976-cs9z4\" (UID: \"645c4439-d2f2-44a5-a115-566f5c436729\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987292 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2262a901-d392-434b-bd32-43555b67f428-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qvdzl\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987324 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1d5570b3-b935-437b-a217-d42cd1394b53-certs\") pod \"machine-config-server-mwp4x\" (UID: \"1d5570b3-b935-437b-a217-d42cd1394b53\") " pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987349 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1d5570b3-b935-437b-a217-d42cd1394b53-node-bootstrap-token\") pod \"machine-config-server-mwp4x\" (UID: \"1d5570b3-b935-437b-a217-d42cd1394b53\") " pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 12:15:57 crc 
kubenswrapper[4760]: I1204 12:15:56.987363 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-registration-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987390 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8b39e58-fac2-4827-8bee-b3e6e11c1a43-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nqzbp\" (UID: \"b8b39e58-fac2-4827-8bee-b3e6e11c1a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987407 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89ac52b9-3fe5-4f64-b128-f88467ed56d4-apiservice-cert\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987425 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9gls\" (UniqueName: \"kubernetes.io/projected/fdbd7bc3-cca1-4368-814a-126ba13a4f8e-kube-api-access-l9gls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nwwhw\" (UID: \"fdbd7bc3-cca1-4368-814a-126ba13a4f8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987454 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da66d59c-5026-4b0f-bbda-c064af724a28-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-8qssb\" (UID: \"da66d59c-5026-4b0f-bbda-c064af724a28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987479 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdz6n\" (UniqueName: \"kubernetes.io/projected/8f32fc56-11e1-419a-9b6f-1e6375b1b6a0-kube-api-access-sdz6n\") pod \"service-ca-9c57cc56f-nw7pr\" (UID: \"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987495 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfzgj\" (UniqueName: \"kubernetes.io/projected/ca0dbe61-a854-4ab3-90af-4404e679cd68-kube-api-access-pfzgj\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987538 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ml6t\" (UniqueName: \"kubernetes.io/projected/2262a901-d392-434b-bd32-43555b67f428-kube-api-access-6ml6t\") pod \"marketplace-operator-79b997595-qvdzl\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987554 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b6c409a-b9ca-4b23-8b08-a9598c1cf220-cert\") pod \"ingress-canary-jlmjr\" (UID: \"5b6c409a-b9ca-4b23-8b08-a9598c1cf220\") " pod="openshift-ingress-canary/ingress-canary-jlmjr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987593 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pktts\" 
(UniqueName: \"kubernetes.io/projected/1d5570b3-b935-437b-a217-d42cd1394b53-kube-api-access-pktts\") pod \"machine-config-server-mwp4x\" (UID: \"1d5570b3-b935-437b-a217-d42cd1394b53\") " pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987624 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rssvp\" (UniqueName: \"kubernetes.io/projected/5b6c409a-b9ca-4b23-8b08-a9598c1cf220-kube-api-access-rssvp\") pod \"ingress-canary-jlmjr\" (UID: \"5b6c409a-b9ca-4b23-8b08-a9598c1cf220\") " pod="openshift-ingress-canary/ingress-canary-jlmjr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987651 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca0dbe61-a854-4ab3-90af-4404e679cd68-default-certificate\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987680 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80cdd4b8-9eeb-42d5-ad43-441e1037f5c4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8jnmg\" (UID: \"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987731 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz26x\" (UniqueName: \"kubernetes.io/projected/96a8ab6f-5954-48fc-bc24-738807b91ea4-kube-api-access-tz26x\") pod \"multus-admission-controller-857f4d67dd-zpxt2\" (UID: \"96a8ab6f-5954-48fc-bc24-738807b91ea4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 
12:15:56.987752 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8f95c4be-368d-4bf1-a109-ee96a5da7491-images\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987774 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqv5d\" (UniqueName: \"kubernetes.io/projected/645c4439-d2f2-44a5-a115-566f5c436729-kube-api-access-sqv5d\") pod \"catalog-operator-68c6474976-cs9z4\" (UID: \"645c4439-d2f2-44a5-a115-566f5c436729\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987799 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2262a901-d392-434b-bd32-43555b67f428-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qvdzl\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987815 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27prx\" (UniqueName: \"kubernetes.io/projected/2623f14b-9edc-48cd-aeba-08cc1155890f-kube-api-access-27prx\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987849 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qgf7\" (UniqueName: \"kubernetes.io/projected/55feea70-8c8d-41ad-afd1-95f0ad24faf0-kube-api-access-8qgf7\") pod \"dns-default-f9rr2\" (UID: 
\"55feea70-8c8d-41ad-afd1-95f0ad24faf0\") " pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987865 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-plugins-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987881 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca0dbe61-a854-4ab3-90af-4404e679cd68-metrics-certs\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987907 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55feea70-8c8d-41ad-afd1-95f0ad24faf0-config-volume\") pod \"dns-default-f9rr2\" (UID: \"55feea70-8c8d-41ad-afd1-95f0ad24faf0\") " pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987933 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f32fc56-11e1-419a-9b6f-1e6375b1b6a0-signing-key\") pod \"service-ca-9c57cc56f-nw7pr\" (UID: \"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987958 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd314b5-2af3-4c1f-9ee9-406c48faaf78-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8br2p\" (UID: \"2dd314b5-2af3-4c1f-9ee9-406c48faaf78\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.987985 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0341ec2d-564b-46ec-a993-625f66f899c8-config\") pod \"service-ca-operator-777779d784-zqr7n\" (UID: \"0341ec2d-564b-46ec-a993-625f66f899c8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988007 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f95c4be-368d-4bf1-a109-ee96a5da7491-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988022 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/89ac52b9-3fe5-4f64-b128-f88467ed56d4-tmpfs\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988039 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jshpm\" (UniqueName: \"kubernetes.io/projected/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-kube-api-access-jshpm\") pod \"collect-profiles-29414175-9zcnx\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988065 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwgtt\" (UniqueName: 
\"kubernetes.io/projected/b865046e-88e6-4c7b-b67e-9d7342619f52-kube-api-access-dwgtt\") pod \"machine-config-controller-84d6567774-6brwl\" (UID: \"b865046e-88e6-4c7b-b67e-9d7342619f52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988118 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da66d59c-5026-4b0f-bbda-c064af724a28-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qssb\" (UID: \"da66d59c-5026-4b0f-bbda-c064af724a28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988143 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hshh\" (UniqueName: \"kubernetes.io/projected/1c03511a-3670-4fe4-929f-5ceaa64d0cb4-kube-api-access-2hshh\") pod \"olm-operator-6b444d44fb-8l6bv\" (UID: \"1c03511a-3670-4fe4-929f-5ceaa64d0cb4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988164 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fdbd7bc3-cca1-4368-814a-126ba13a4f8e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nwwhw\" (UID: \"fdbd7bc3-cca1-4368-814a-126ba13a4f8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988181 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0dbe61-a854-4ab3-90af-4404e679cd68-service-ca-bundle\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " 
pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988225 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/96a8ab6f-5954-48fc-bc24-738807b91ea4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zpxt2\" (UID: \"96a8ab6f-5954-48fc-bc24-738807b91ea4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988261 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f95c4be-368d-4bf1-a109-ee96a5da7491-proxy-tls\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988278 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da66d59c-5026-4b0f-bbda-c064af724a28-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qssb\" (UID: \"da66d59c-5026-4b0f-bbda-c064af724a28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988293 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-socket-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988324 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b865046e-88e6-4c7b-b67e-9d7342619f52-proxy-tls\") pod 
\"machine-config-controller-84d6567774-6brwl\" (UID: \"b865046e-88e6-4c7b-b67e-9d7342619f52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988340 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1c03511a-3670-4fe4-929f-5ceaa64d0cb4-srv-cert\") pod \"olm-operator-6b444d44fb-8l6bv\" (UID: \"1c03511a-3670-4fe4-929f-5ceaa64d0cb4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988358 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80cdd4b8-9eeb-42d5-ad43-441e1037f5c4-config\") pod \"kube-apiserver-operator-766d6c64bb-8jnmg\" (UID: \"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988375 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1c03511a-3670-4fe4-929f-5ceaa64d0cb4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8l6bv\" (UID: \"1c03511a-3670-4fe4-929f-5ceaa64d0cb4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988392 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0341ec2d-564b-46ec-a993-625f66f899c8-serving-cert\") pod \"service-ca-operator-777779d784-zqr7n\" (UID: \"0341ec2d-564b-46ec-a993-625f66f899c8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988409 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/80cdd4b8-9eeb-42d5-ad43-441e1037f5c4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8jnmg\" (UID: \"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988426 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5pjv\" (UniqueName: \"kubernetes.io/projected/2dd314b5-2af3-4c1f-9ee9-406c48faaf78-kube-api-access-r5pjv\") pod \"kube-storage-version-migrator-operator-b67b599dd-8br2p\" (UID: \"2dd314b5-2af3-4c1f-9ee9-406c48faaf78\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988466 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ng5\" (UniqueName: \"kubernetes.io/projected/8f95c4be-368d-4bf1-a109-ee96a5da7491-kube-api-access-s4ng5\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988495 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/645c4439-d2f2-44a5-a115-566f5c436729-profile-collector-cert\") pod \"catalog-operator-68c6474976-cs9z4\" (UID: \"645c4439-d2f2-44a5-a115-566f5c436729\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988526 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chs29\" (UniqueName: \"kubernetes.io/projected/b8b39e58-fac2-4827-8bee-b3e6e11c1a43-kube-api-access-chs29\") pod \"package-server-manager-789f6589d5-nqzbp\" (UID: 
\"b8b39e58-fac2-4827-8bee-b3e6e11c1a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988555 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-config-volume\") pod \"collect-profiles-29414175-9zcnx\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988574 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b865046e-88e6-4c7b-b67e-9d7342619f52-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6brwl\" (UID: \"b865046e-88e6-4c7b-b67e-9d7342619f52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988640 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvshd\" (UniqueName: \"kubernetes.io/projected/89ac52b9-3fe5-4f64-b128-f88467ed56d4-kube-api-access-qvshd\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988680 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f32fc56-11e1-419a-9b6f-1e6375b1b6a0-signing-cabundle\") pod \"service-ca-9c57cc56f-nw7pr\" (UID: \"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.988713 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/55feea70-8c8d-41ad-afd1-95f0ad24faf0-metrics-tls\") pod \"dns-default-f9rr2\" (UID: \"55feea70-8c8d-41ad-afd1-95f0ad24faf0\") " pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:56.989239 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:57.489195183 +0000 UTC m=+160.530641760 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:56.994022 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-csi-data-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.012043 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55feea70-8c8d-41ad-afd1-95f0ad24faf0-config-volume\") pod \"dns-default-f9rr2\" (UID: \"55feea70-8c8d-41ad-afd1-95f0ad24faf0\") " pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.012817 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-mountpoint-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.014155 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd314b5-2af3-4c1f-9ee9-406c48faaf78-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8br2p\" (UID: \"2dd314b5-2af3-4c1f-9ee9-406c48faaf78\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.015313 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-registration-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.018144 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-socket-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.018907 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0341ec2d-564b-46ec-a993-625f66f899c8-config\") pod \"service-ca-operator-777779d784-zqr7n\" (UID: \"0341ec2d-564b-46ec-a993-625f66f899c8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.022286 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/8f95c4be-368d-4bf1-a109-ee96a5da7491-images\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.024292 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-config-volume\") pod \"collect-profiles-29414175-9zcnx\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.025107 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b865046e-88e6-4c7b-b67e-9d7342619f52-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6brwl\" (UID: \"b865046e-88e6-4c7b-b67e-9d7342619f52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.026480 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f95c4be-368d-4bf1-a109-ee96a5da7491-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.032538 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2262a901-d392-434b-bd32-43555b67f428-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qvdzl\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:15:57 crc 
kubenswrapper[4760]: I1204 12:15:57.032769 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8b39e58-fac2-4827-8bee-b3e6e11c1a43-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nqzbp\" (UID: \"b8b39e58-fac2-4827-8bee-b3e6e11c1a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.034285 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/89ac52b9-3fe5-4f64-b128-f88467ed56d4-tmpfs\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.035471 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca0dbe61-a854-4ab3-90af-4404e679cd68-stats-auth\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.036997 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f32fc56-11e1-419a-9b6f-1e6375b1b6a0-signing-key\") pod \"service-ca-9c57cc56f-nw7pr\" (UID: \"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.037595 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2623f14b-9edc-48cd-aeba-08cc1155890f-plugins-dir\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 
crc kubenswrapper[4760]: I1204 12:15:57.040180 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80cdd4b8-9eeb-42d5-ad43-441e1037f5c4-config\") pod \"kube-apiserver-operator-766d6c64bb-8jnmg\" (UID: \"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.042293 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da66d59c-5026-4b0f-bbda-c064af724a28-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qssb\" (UID: \"da66d59c-5026-4b0f-bbda-c064af724a28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.043060 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2262a901-d392-434b-bd32-43555b67f428-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qvdzl\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.043369 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ml6t\" (UniqueName: \"kubernetes.io/projected/2262a901-d392-434b-bd32-43555b67f428-kube-api-access-6ml6t\") pod \"marketplace-operator-79b997595-qvdzl\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.043690 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f32fc56-11e1-419a-9b6f-1e6375b1b6a0-signing-cabundle\") pod \"service-ca-9c57cc56f-nw7pr\" (UID: 
\"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.045512 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89ac52b9-3fe5-4f64-b128-f88467ed56d4-webhook-cert\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.045846 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b865046e-88e6-4c7b-b67e-9d7342619f52-proxy-tls\") pod \"machine-config-controller-84d6567774-6brwl\" (UID: \"b865046e-88e6-4c7b-b67e-9d7342619f52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.046160 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca0dbe61-a854-4ab3-90af-4404e679cd68-service-ca-bundle\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.047734 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pktts\" (UniqueName: \"kubernetes.io/projected/1d5570b3-b935-437b-a217-d42cd1394b53-kube-api-access-pktts\") pod \"machine-config-server-mwp4x\" (UID: \"1d5570b3-b935-437b-a217-d42cd1394b53\") " pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.048782 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80cdd4b8-9eeb-42d5-ad43-441e1037f5c4-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-8jnmg\" (UID: \"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.053074 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz26x\" (UniqueName: \"kubernetes.io/projected/96a8ab6f-5954-48fc-bc24-738807b91ea4-kube-api-access-tz26x\") pod \"multus-admission-controller-857f4d67dd-zpxt2\" (UID: \"96a8ab6f-5954-48fc-bc24-738807b91ea4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.054144 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/96a8ab6f-5954-48fc-bc24-738807b91ea4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-zpxt2\" (UID: \"96a8ab6f-5954-48fc-bc24-738807b91ea4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.056390 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da66d59c-5026-4b0f-bbda-c064af724a28-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qssb\" (UID: \"da66d59c-5026-4b0f-bbda-c064af724a28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.057630 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca0dbe61-a854-4ab3-90af-4404e679cd68-default-certificate\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.058951 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f95c4be-368d-4bf1-a109-ee96a5da7491-proxy-tls\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.061914 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca0dbe61-a854-4ab3-90af-4404e679cd68-metrics-certs\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.062530 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1c03511a-3670-4fe4-929f-5ceaa64d0cb4-srv-cert\") pod \"olm-operator-6b444d44fb-8l6bv\" (UID: \"1c03511a-3670-4fe4-929f-5ceaa64d0cb4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.064397 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-secret-volume\") pod \"collect-profiles-29414175-9zcnx\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.123482 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1c03511a-3670-4fe4-929f-5ceaa64d0cb4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8l6bv\" (UID: \"1c03511a-3670-4fe4-929f-5ceaa64d0cb4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.123581 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sdz6n\" (UniqueName: \"kubernetes.io/projected/8f32fc56-11e1-419a-9b6f-1e6375b1b6a0-kube-api-access-sdz6n\") pod \"service-ca-9c57cc56f-nw7pr\" (UID: \"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0\") " pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.124538 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1d5570b3-b935-437b-a217-d42cd1394b53-certs\") pod \"machine-config-server-mwp4x\" (UID: \"1d5570b3-b935-437b-a217-d42cd1394b53\") " pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.125678 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b6c409a-b9ca-4b23-8b08-a9598c1cf220-cert\") pod \"ingress-canary-jlmjr\" (UID: \"5b6c409a-b9ca-4b23-8b08-a9598c1cf220\") " pod="openshift-ingress-canary/ingress-canary-jlmjr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.129920 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1d5570b3-b935-437b-a217-d42cd1394b53-node-bootstrap-token\") pod \"machine-config-server-mwp4x\" (UID: \"1d5570b3-b935-437b-a217-d42cd1394b53\") " pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.130397 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/55feea70-8c8d-41ad-afd1-95f0ad24faf0-metrics-tls\") pod \"dns-default-f9rr2\" (UID: \"55feea70-8c8d-41ad-afd1-95f0ad24faf0\") " pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.130834 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/89ac52b9-3fe5-4f64-b128-f88467ed56d4-apiservice-cert\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.135174 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.135277 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/645c4439-d2f2-44a5-a115-566f5c436729-srv-cert\") pod \"catalog-operator-68c6474976-cs9z4\" (UID: \"645c4439-d2f2-44a5-a115-566f5c436729\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:57.135731 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:57.63571664 +0000 UTC m=+160.677163207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.136168 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rssvp\" (UniqueName: \"kubernetes.io/projected/5b6c409a-b9ca-4b23-8b08-a9598c1cf220-kube-api-access-rssvp\") pod \"ingress-canary-jlmjr\" (UID: \"5b6c409a-b9ca-4b23-8b08-a9598c1cf220\") " pod="openshift-ingress-canary/ingress-canary-jlmjr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.136519 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0341ec2d-564b-46ec-a993-625f66f899c8-serving-cert\") pod \"service-ca-operator-777779d784-zqr7n\" (UID: \"0341ec2d-564b-46ec-a993-625f66f899c8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.137058 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hshh\" (UniqueName: \"kubernetes.io/projected/1c03511a-3670-4fe4-929f-5ceaa64d0cb4-kube-api-access-2hshh\") pod \"olm-operator-6b444d44fb-8l6bv\" (UID: \"1c03511a-3670-4fe4-929f-5ceaa64d0cb4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.137746 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fdbd7bc3-cca1-4368-814a-126ba13a4f8e-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-nwwhw\" (UID: \"fdbd7bc3-cca1-4368-814a-126ba13a4f8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.143175 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd314b5-2af3-4c1f-9ee9-406c48faaf78-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8br2p\" (UID: \"2dd314b5-2af3-4c1f-9ee9-406c48faaf78\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.143706 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80cdd4b8-9eeb-42d5-ad43-441e1037f5c4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8jnmg\" (UID: \"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.148331 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhhh9\" (UniqueName: \"kubernetes.io/projected/0341ec2d-564b-46ec-a993-625f66f899c8-kube-api-access-zhhh9\") pod \"service-ca-operator-777779d784-zqr7n\" (UID: \"0341ec2d-564b-46ec-a993-625f66f899c8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.150312 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.160942 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.163658 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/645c4439-d2f2-44a5-a115-566f5c436729-profile-collector-cert\") pod \"catalog-operator-68c6474976-cs9z4\" (UID: \"645c4439-d2f2-44a5-a115-566f5c436729\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.183483 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qgf7\" (UniqueName: \"kubernetes.io/projected/55feea70-8c8d-41ad-afd1-95f0ad24faf0-kube-api-access-8qgf7\") pod \"dns-default-f9rr2\" (UID: \"55feea70-8c8d-41ad-afd1-95f0ad24faf0\") " pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.188257 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqv5d\" (UniqueName: \"kubernetes.io/projected/645c4439-d2f2-44a5-a115-566f5c436729-kube-api-access-sqv5d\") pod \"catalog-operator-68c6474976-cs9z4\" (UID: \"645c4439-d2f2-44a5-a115-566f5c436729\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.193700 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chs29\" (UniqueName: \"kubernetes.io/projected/b8b39e58-fac2-4827-8bee-b3e6e11c1a43-kube-api-access-chs29\") pod \"package-server-manager-789f6589d5-nqzbp\" (UID: \"b8b39e58-fac2-4827-8bee-b3e6e11c1a43\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.200718 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.210710 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27prx\" (UniqueName: \"kubernetes.io/projected/2623f14b-9edc-48cd-aeba-08cc1155890f-kube-api-access-27prx\") pod \"csi-hostpathplugin-58wzk\" (UID: \"2623f14b-9edc-48cd-aeba-08cc1155890f\") " pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.212648 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jshpm\" (UniqueName: \"kubernetes.io/projected/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-kube-api-access-jshpm\") pod \"collect-profiles-29414175-9zcnx\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.214338 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwgtt\" (UniqueName: \"kubernetes.io/projected/b865046e-88e6-4c7b-b67e-9d7342619f52-kube-api-access-dwgtt\") pod \"machine-config-controller-84d6567774-6brwl\" (UID: \"b865046e-88e6-4c7b-b67e-9d7342619f52\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.215719 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvshd\" (UniqueName: \"kubernetes.io/projected/89ac52b9-3fe5-4f64-b128-f88467ed56d4-kube-api-access-qvshd\") pod \"packageserver-d55dfcdfc-jw2zk\" (UID: \"89ac52b9-3fe5-4f64-b128-f88467ed56d4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.217680 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.225275 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.236965 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:57.237576 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:57.73755316 +0000 UTC m=+160.778999727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.245093 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jlmjr" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.257856 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/da66d59c-5026-4b0f-bbda-c064af724a28-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8qssb\" (UID: \"da66d59c-5026-4b0f-bbda-c064af724a28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.258301 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.270322 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-58wzk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.277493 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mwp4x" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.290822 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4ng5\" (UniqueName: \"kubernetes.io/projected/8f95c4be-368d-4bf1-a109-ee96a5da7491-kube-api-access-s4ng5\") pod \"machine-config-operator-74547568cd-q2tmn\" (UID: \"8f95c4be-368d-4bf1-a109-ee96a5da7491\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.303355 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfzgj\" (UniqueName: \"kubernetes.io/projected/ca0dbe61-a854-4ab3-90af-4404e679cd68-kube-api-access-pfzgj\") pod \"router-default-5444994796-42rhb\" (UID: \"ca0dbe61-a854-4ab3-90af-4404e679cd68\") " pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.303610 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5pjv\" (UniqueName: \"kubernetes.io/projected/2dd314b5-2af3-4c1f-9ee9-406c48faaf78-kube-api-access-r5pjv\") pod \"kube-storage-version-migrator-operator-b67b599dd-8br2p\" (UID: \"2dd314b5-2af3-4c1f-9ee9-406c48faaf78\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.328291 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-f9rr2" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.344156 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:57.344991 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:57.844965973 +0000 UTC m=+160.886412540 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.385927 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9gls\" (UniqueName: \"kubernetes.io/projected/fdbd7bc3-cca1-4368-814a-126ba13a4f8e-kube-api-access-l9gls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nwwhw\" (UID: \"fdbd7bc3-cca1-4368-814a-126ba13a4f8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.417484 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q"] Dec 04 
12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.418104 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.428235 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.440408 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.445837 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:57.446295 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:57.946275666 +0000 UTC m=+160.987722233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.446316 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.454633 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.473725 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.479605 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.487328 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.513731 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.533341 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.547486 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:57.547984 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:58.047966992 +0000 UTC m=+161.089413549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.548060 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c7hmd"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.559865 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dn2jn"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.650402 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:57.650973 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:58.150937169 +0000 UTC m=+161.192383736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: W1204 12:15:57.659187 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod397d3069_2845_40f6_bbb9_d2541d0f3f80.slice/crio-719c87f8bd49c66ac9d647b9785beb479cf326b5f7462d84fb4c15a67b398ec3 WatchSource:0}: Error finding container 719c87f8bd49c66ac9d647b9785beb479cf326b5f7462d84fb4c15a67b398ec3: Status 404 returned error can't find the container with id 719c87f8bd49c66ac9d647b9785beb479cf326b5f7462d84fb4c15a67b398ec3 Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.731155 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-nc52r" podStartSLOduration=134.73113822 podStartE2EDuration="2m14.73113822s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-04 12:15:57.730990515 +0000 UTC m=+160.772437082" watchObservedRunningTime="2025-12-04 12:15:57.73113822 +0000 UTC m=+160.772584787" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.731926 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghltq" podStartSLOduration=133.731918535 podStartE2EDuration="2m13.731918535s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:57.671248776 +0000 UTC m=+160.712695353" watchObservedRunningTime="2025-12-04 12:15:57.731918535 +0000 UTC m=+160.773365102" Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.761261 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:57.761632 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:58.261618609 +0000 UTC m=+161.303065186 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.822262 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.822332 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.822351 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2qtg7"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.836108 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.848614 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bmjcx"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.848878 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nw7pr"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.866075 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:57.866312 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:58.366280363 +0000 UTC m=+161.407726930 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.866596 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:57.867071 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:58.367055568 +0000 UTC m=+161.408502135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.888905 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.920131 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.965159 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5lxbp"] Dec 04 12:15:57 crc kubenswrapper[4760]: I1204 12:15:57.972925 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:57 crc kubenswrapper[4760]: E1204 12:15:57.973548 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:58.473527941 +0000 UTC m=+161.514974508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:57 crc kubenswrapper[4760]: W1204 12:15:57.977347 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2a04888_17e2_4e76_aa2a_6d1fe1ccfd0f.slice/crio-88a06f7ab13ad8d54e9adef41e0f3f29cc9f3cf612968c67051e358353724ebd WatchSource:0}: Error finding container 88a06f7ab13ad8d54e9adef41e0f3f29cc9f3cf612968c67051e358353724ebd: Status 404 returned error can't find the container with id 88a06f7ab13ad8d54e9adef41e0f3f29cc9f3cf612968c67051e358353724ebd Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.092823 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-42rhb" event={"ID":"ca0dbe61-a854-4ab3-90af-4404e679cd68","Type":"ContainerStarted","Data":"8d345327f081d692bdbdf3371a8ffe927dc954e0f9fa3948578807acbda889ac"} Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.104545 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:58 crc kubenswrapper[4760]: E1204 12:15:58.105023 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-04 12:15:58.605007073 +0000 UTC m=+161.646453640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.148441 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78" event={"ID":"55d5b8e1-f975-4db6-a2e5-4d5f40dff81c","Type":"ContainerStarted","Data":"fca62eead97a68d2df0dc1816ef470102cdc28de4fa6f350bc841200b7bf21cf"} Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.153063 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" podStartSLOduration=134.153035799 podStartE2EDuration="2m14.153035799s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:58.148867032 +0000 UTC m=+161.190313599" watchObservedRunningTime="2025-12-04 12:15:58.153035799 +0000 UTC m=+161.194482366" Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.180559 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" event={"ID":"76f89a4f-bf24-4abe-a62b-af295dc6f208","Type":"ContainerStarted","Data":"4d44e3ed0de9bf1ec40ff676781f30b17ad365ff09054a6f52e548d43fdd884e"} Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.186963 4760 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q" event={"ID":"78d9eb51-de42-4536-a561-aee39bfd92f3","Type":"ContainerStarted","Data":"adcfc04208743e92347fb2be4271bf36d87d5e99ec8ffa18d3cbcb1c6492c379"} Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.210023 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:58 crc kubenswrapper[4760]: E1204 12:15:58.210480 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:58.710460252 +0000 UTC m=+161.751906819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.211164 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd" event={"ID":"760a4690-8674-4847-af35-49eec4903ef2","Type":"ContainerStarted","Data":"aa6cc4a8a9fc7742b32d288609325dcb6a6e4c38fe4c432c489cffb1889b8c77"} Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.223688 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dn2jn" event={"ID":"397d3069-2845-40f6-bbb9-d2541d0f3f80","Type":"ContainerStarted","Data":"719c87f8bd49c66ac9d647b9785beb479cf326b5f7462d84fb4c15a67b398ec3"} Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.261651 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" event={"ID":"33a1d2dc-800d-4409-a3a7-927f1dd0cfd5","Type":"ContainerStarted","Data":"912ef33f75f97db32610d72d7017a2b92357c1e62d59de96ddaf9789c8088c4f"} Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.277562 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mwp4x" event={"ID":"1d5570b3-b935-437b-a217-d42cd1394b53","Type":"ContainerStarted","Data":"8b9ac3b6cdff5417dcc893367f40fcfa0c4c72ff201e9cbb3ce5b96cb32a1d66"} Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.280421 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fdnjr" 
podStartSLOduration=135.280407816 podStartE2EDuration="2m15.280407816s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:58.278546886 +0000 UTC m=+161.319993453" watchObservedRunningTime="2025-12-04 12:15:58.280407816 +0000 UTC m=+161.321854383" Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.289170 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" event={"ID":"af58c416-6966-498a-9e34-9cd879d3a21c","Type":"ContainerStarted","Data":"3e380d4bd73741ffdaa3326db24a98adf64ece81d23252fc2bca7a24ba057285"} Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.311964 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:58 crc kubenswrapper[4760]: E1204 12:15:58.313984 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:58.813969397 +0000 UTC m=+161.855415964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.415875 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:58 crc kubenswrapper[4760]: E1204 12:15:58.416613 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:58.916597503 +0000 UTC m=+161.958044070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.520646 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:58 crc kubenswrapper[4760]: E1204 12:15:58.523396 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:59.023372206 +0000 UTC m=+162.064818773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.633462 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:58 crc kubenswrapper[4760]: E1204 12:15:58.633741 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:59.133724156 +0000 UTC m=+162.175170723 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.674251 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-cn9kc" podStartSLOduration=135.674229404 podStartE2EDuration="2m15.674229404s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:58.673726898 +0000 UTC m=+161.715173465" watchObservedRunningTime="2025-12-04 12:15:58.674229404 +0000 UTC m=+161.715675971" Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.745543 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:58 crc kubenswrapper[4760]: E1204 12:15:58.745835 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:59.245823223 +0000 UTC m=+162.287269790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.762499 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.879963 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:58 crc kubenswrapper[4760]: E1204 12:15:58.880479 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:59.380453589 +0000 UTC m=+162.421900156 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.913591 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-nc52r" Dec 04 12:15:58 crc kubenswrapper[4760]: I1204 12:15:58.982928 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:58 crc kubenswrapper[4760]: E1204 12:15:58.983438 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:59.483414466 +0000 UTC m=+162.524861033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.084064 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:59 crc kubenswrapper[4760]: E1204 12:15:59.084608 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:59.584590094 +0000 UTC m=+162.626036661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.151726 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" podStartSLOduration=136.151695636 podStartE2EDuration="2m16.151695636s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:59.122624152 +0000 UTC m=+162.164070719" watchObservedRunningTime="2025-12-04 12:15:59.151695636 +0000 UTC m=+162.193142203" Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.186799 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:59 crc kubenswrapper[4760]: E1204 12:15:59.187341 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:59.687320714 +0000 UTC m=+162.728767291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.193696 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" podStartSLOduration=135.193672412 podStartE2EDuration="2m15.193672412s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:59.193534208 +0000 UTC m=+162.234980785" watchObservedRunningTime="2025-12-04 12:15:59.193672412 +0000 UTC m=+162.235118979" Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.274542 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-68c2j" podStartSLOduration=135.274522044 podStartE2EDuration="2m15.274522044s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:15:59.225445275 +0000 UTC m=+162.266891842" watchObservedRunningTime="2025-12-04 12:15:59.274522044 +0000 UTC m=+162.315968621" Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.287372 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:59 crc kubenswrapper[4760]: E1204 12:15:59.287651 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:59.787632234 +0000 UTC m=+162.829078801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.390361 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:59 crc kubenswrapper[4760]: E1204 12:15:59.390768 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:15:59.890749977 +0000 UTC m=+162.932196544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.444767 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" event={"ID":"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0","Type":"ContainerStarted","Data":"441c2348a37eedf0bc1c62ac212f9e3d0bd9d2f35537d7439c762997e5a3e996"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.445601 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" event={"ID":"4279cf63-15b8-49a2-ab5c-794bdbf0fda8","Type":"ContainerStarted","Data":"fd3b6fc3143db0de8a83aeb7aab508f731b3f1b3818a605de28f90ace55a08ba"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.446942 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" event={"ID":"65668b2d-24cd-4a5f-8b00-ed3778e134fd","Type":"ContainerStarted","Data":"80d493d0f3129f4f2834a07e6b34c29fd0232e14a049506669724c279172a05b"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.448003 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" event={"ID":"eceebe1d-88a4-4b8c-819d-75d63c04aeb4","Type":"ContainerStarted","Data":"ff49592fb394e2b85082d4325b9d9da4a2cac530df2a3c9330758912238061d7"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.458645 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" event={"ID":"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f","Type":"ContainerStarted","Data":"c7551c6b3868f39641bc094fb5a4b2311ac6081d9d6dae2feb8ee268fe262039"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.458711 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" event={"ID":"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f","Type":"ContainerStarted","Data":"88a06f7ab13ad8d54e9adef41e0f3f29cc9f3cf612968c67051e358353724ebd"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.463812 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5lxbp" event={"ID":"8c791b99-3020-4b20-9d91-87c5ba9f615a","Type":"ContainerStarted","Data":"53643b3b518b2595fe5f8491b9dafc6af6f8618c6efa1202dfc1618c1059b12e"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.480071 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dn2jn" event={"ID":"397d3069-2845-40f6-bbb9-d2541d0f3f80","Type":"ContainerStarted","Data":"49475bdc88c035a487264e70f091acd1ad6d14087479f7950da0ec1f9970ea00"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.480670 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dn2jn" Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.483145 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.483227 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" 
output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.493472 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:59 crc kubenswrapper[4760]: E1204 12:15:59.494528 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:15:59.99451121 +0000 UTC m=+163.035957777 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.504111 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78" event={"ID":"55d5b8e1-f975-4db6-a2e5-4d5f40dff81c","Type":"ContainerStarted","Data":"5a5752c4d08c4a8a1b826d02fdc1fae4bc735affafc53039ee11747a84f17ce4"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.505806 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" 
event={"ID":"2914d4c6-5bae-4d02-aaf6-13556172e946","Type":"ContainerStarted","Data":"2b5e38ad566c584b439f57ecf54d63173d0181dfb4444a2d4a56069e81d9cb94"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.506573 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" event={"ID":"87218323-b321-4cd8-8da1-5fa8769eb3b0","Type":"ContainerStarted","Data":"06b3fa21c4a50b028ef1278937efcebe92e6883edd59d72cb1df2bf83343d720"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.507421 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k" event={"ID":"94f4372f-7c56-4cf5-9384-ae555845f15a","Type":"ContainerStarted","Data":"3740f771bce42032f5df01882e7affcd601b884450ec3a19b0c2de74c1265d6c"} Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.595927 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:59 crc kubenswrapper[4760]: E1204 12:15:59.596324 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.096311359 +0000 UTC m=+163.137757926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.709000 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:59 crc kubenswrapper[4760]: E1204 12:15:59.709338 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.209308366 +0000 UTC m=+163.250754933 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.709639 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:59 crc kubenswrapper[4760]: E1204 12:15:59.714552 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.214538677 +0000 UTC m=+163.255985314 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.812632 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:15:59 crc kubenswrapper[4760]: E1204 12:15:59.813248 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.313227875 +0000 UTC m=+163.354674442 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:15:59 crc kubenswrapper[4760]: I1204 12:15:59.917486 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:15:59 crc kubenswrapper[4760]: E1204 12:15:59.917933 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.417919878 +0000 UTC m=+163.459366445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.018579 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.019654 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.519628535 +0000 UTC m=+163.561075102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.048481 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.048942 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.120760 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.121242 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.621190996 +0000 UTC m=+163.662637573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.231968 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.232552 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.732534778 +0000 UTC m=+163.773981345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.232689 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.233056 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.733047845 +0000 UTC m=+163.774494412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.282698 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dn2jn" podStartSLOduration=137.282678633 podStartE2EDuration="2m17.282678633s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:00.232231628 +0000 UTC m=+163.273678195" watchObservedRunningTime="2025-12-04 12:16:00.282678633 +0000 UTC m=+163.324125200" Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.335070 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.335543 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.835510146 +0000 UTC m=+163.876956703 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.457241 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.457683 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:00.957666742 +0000 UTC m=+163.999113309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.517846 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q" event={"ID":"78d9eb51-de42-4536-a561-aee39bfd92f3","Type":"ContainerStarted","Data":"4131ac59c4c6bb1d76be56f7a852b940f308d04aca22050fb06a930527f6d010"} Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.519149 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k" event={"ID":"94f4372f-7c56-4cf5-9384-ae555845f15a","Type":"ContainerStarted","Data":"512724ed22a110a89a6f97ea91f2f8ee4e802d51acedb9ade120f12426701029"} Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.528034 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd" event={"ID":"760a4690-8674-4847-af35-49eec4903ef2","Type":"ContainerStarted","Data":"f777a0c869a5f52087dad9dd9dcbc42a92d76abe8e7a51a8ab77129183edaa81"} Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.531091 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mwp4x" event={"ID":"1d5570b3-b935-437b-a217-d42cd1394b53","Type":"ContainerStarted","Data":"9731dee1bb3a15870a8241cc6f31f6294584d5b04eec61465ea68a0fbd94174d"} Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.544113 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" event={"ID":"af58c416-6966-498a-9e34-9cd879d3a21c","Type":"ContainerStarted","Data":"ad77190a55d85842182712e3faf02595f71bcea135e62347ae234f5a0cee75e3"} Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.559256 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.559762 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.059741331 +0000 UTC m=+164.101187898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.567903 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-42rhb" event={"ID":"ca0dbe61-a854-4ab3-90af-4404e679cd68","Type":"ContainerStarted","Data":"e698b104fc9cd50a345f8e0ebb161d7d1607951d719f826a55562051b5682601"} Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.574679 4760 generic.go:334] "Generic (PLEG): container finished" podID="e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f" containerID="c7551c6b3868f39641bc094fb5a4b2311ac6081d9d6dae2feb8ee268fe262039" exitCode=0 Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.575288 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lf78k" podStartSLOduration=137.57526893 podStartE2EDuration="2m17.57526893s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:00.572672174 +0000 UTC m=+163.614118751" watchObservedRunningTime="2025-12-04 12:16:00.57526893 +0000 UTC m=+163.616715497" Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.576495 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nwh78" podStartSLOduration=136.57648709 podStartE2EDuration="2m16.57648709s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:00.331521764 +0000 UTC m=+163.372968351" watchObservedRunningTime="2025-12-04 12:16:00.57648709 +0000 UTC m=+163.617933657" Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.577543 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" event={"ID":"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f","Type":"ContainerDied","Data":"c7551c6b3868f39641bc094fb5a4b2311ac6081d9d6dae2feb8ee268fe262039"} Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.593433 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.593850 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.596713 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvdzl"] Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.635314 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mwp4x" podStartSLOduration=6.635287679 podStartE2EDuration="6.635287679s" podCreationTimestamp="2025-12-04 12:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:00.635090572 +0000 UTC m=+163.676537139" 
watchObservedRunningTime="2025-12-04 12:16:00.635287679 +0000 UTC m=+163.676734246" Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.662370 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.664556 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.164535349 +0000 UTC m=+164.205981916 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.687593 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2ncpt" podStartSLOduration=137.687575293 podStartE2EDuration="2m17.687575293s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:00.687125719 +0000 UTC m=+163.728572286" watchObservedRunningTime="2025-12-04 12:16:00.687575293 +0000 UTC m=+163.729021860" Dec 04 12:16:00 crc 
kubenswrapper[4760]: I1204 12:16:00.752096 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jlmjr"] Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.762482 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-42rhb" podStartSLOduration=136.76245082 podStartE2EDuration="2m16.76245082s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:00.753712653 +0000 UTC m=+163.795159220" watchObservedRunningTime="2025-12-04 12:16:00.76245082 +0000 UTC m=+163.803897397" Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.763702 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.765899 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.265877952 +0000 UTC m=+164.307324519 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.836516 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv"] Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.867814 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.868293 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.368271881 +0000 UTC m=+164.409718448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.868560 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.963994 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg"] Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.968965 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:00 crc kubenswrapper[4760]: E1204 12:16:00.969041 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.469022726 +0000 UTC m=+164.510469293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:00 crc kubenswrapper[4760]: I1204 12:16:00.999016 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:00.999983 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.4999571 +0000 UTC m=+164.541403667 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.108388 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.108905 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.608877523 +0000 UTC m=+164.650324090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.157616 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f9rr2"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.183435 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.210633 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.211159 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.711141927 +0000 UTC m=+164.752588484 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: W1204 12:16:01.231817 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55feea70_8c8d_41ad_afd1_95f0ad24faf0.slice/crio-b60f1ba368cdbad6e25e5b71203b9a198b1887e86c1b766e3fa7e2daaa27fcf3 WatchSource:0}: Error finding container b60f1ba368cdbad6e25e5b71203b9a198b1887e86c1b766e3fa7e2daaa27fcf3: Status 404 returned error can't find the container with id b60f1ba368cdbad6e25e5b71203b9a198b1887e86c1b766e3fa7e2daaa27fcf3 Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.311728 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.312050 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.812005335 +0000 UTC m=+164.853451902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.312292 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.313026 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.813010689 +0000 UTC m=+164.854457256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.326898 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-58wzk"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.374837 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.390352 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.416796 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.417282 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:01.917257238 +0000 UTC m=+164.958703805 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.451289 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.467991 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:01 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:01 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:01 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.468050 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.471390 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.473137 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn"] Dec 04 12:16:01 crc kubenswrapper[4760]: W1204 12:16:01.479661 4760 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89ac52b9_3fe5_4f64_b128_f88467ed56d4.slice/crio-83d6c48684e3a6adda3c9d3528c70e0a89eca58014d2cdd02e8bc1251d0d5049 WatchSource:0}: Error finding container 83d6c48684e3a6adda3c9d3528c70e0a89eca58014d2cdd02e8bc1251d0d5049: Status 404 returned error can't find the container with id 83d6c48684e3a6adda3c9d3528c70e0a89eca58014d2cdd02e8bc1251d0d5049 Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.511101 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.525370 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.525884 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.02586414 +0000 UTC m=+165.067310707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.529559 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-zpxt2"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.544219 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.544267 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.613981 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.629616 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.629757 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 12:16:02.129732158 +0000 UTC m=+165.171178725 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.630178 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.630588 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.130576735 +0000 UTC m=+165.172023302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.649578 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" event={"ID":"65668b2d-24cd-4a5f-8b00-ed3778e134fd","Type":"ContainerStarted","Data":"6097f29e7fba04d5bb24df73b62c36a85da3e7213b439b2f1843e3a6ce985342"} Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.656763 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb"] Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.677021 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f9rr2" event={"ID":"55feea70-8c8d-41ad-afd1-95f0ad24faf0","Type":"ContainerStarted","Data":"b60f1ba368cdbad6e25e5b71203b9a198b1887e86c1b766e3fa7e2daaa27fcf3"} Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.706868 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jlmjr" event={"ID":"5b6c409a-b9ca-4b23-8b08-a9598c1cf220","Type":"ContainerStarted","Data":"47d987ea850cbfcdcb7fbf63eef08d4be0ccd79a2c810793a588a4dc9b50bab2"} Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.706933 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jlmjr" event={"ID":"5b6c409a-b9ca-4b23-8b08-a9598c1cf220","Type":"ContainerStarted","Data":"62da82d23e908d96492dcbcd63d8e478eead785d53f131fcf8d71bbc7a8b3f87"} Dec 04 
12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.731170 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.733023 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.232979934 +0000 UTC m=+165.274426511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.793262 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8j4hs" podStartSLOduration=137.7932392 podStartE2EDuration="2m17.7932392s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:01.740844622 +0000 UTC m=+164.782291189" watchObservedRunningTime="2025-12-04 12:16:01.7932392 +0000 UTC m=+164.834685767" Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.793891 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-jlmjr" podStartSLOduration=8.793885842 podStartE2EDuration="8.793885842s" podCreationTimestamp="2025-12-04 12:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:01.790816011 +0000 UTC m=+164.832262578" watchObservedRunningTime="2025-12-04 12:16:01.793885842 +0000 UTC m=+164.835332409" Dec 04 12:16:01 crc kubenswrapper[4760]: W1204 12:16:01.807396 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdbd7bc3_cca1_4368_814a_126ba13a4f8e.slice/crio-faa1cd79b41af4f89c1219691ce112340b668842b4babfa51ff32c9a0f75dfaf WatchSource:0}: Error finding container faa1cd79b41af4f89c1219691ce112340b668842b4babfa51ff32c9a0f75dfaf: Status 404 returned error can't find the container with id faa1cd79b41af4f89c1219691ce112340b668842b4babfa51ff32c9a0f75dfaf Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.843060 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" event={"ID":"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0","Type":"ContainerStarted","Data":"ec7e0605097284c5fedb4eaac78e880df0ace483ff13ff8e81a6ca575103ec2e"} Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.846582 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.847077 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.347057166 +0000 UTC m=+165.388503733 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.947924 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:01 crc kubenswrapper[4760]: E1204 12:16:01.948374 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.448340278 +0000 UTC m=+165.489786845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.949324 4760 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qvdzl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 04 12:16:01 crc kubenswrapper[4760]: I1204 12:16:01.949416 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" podUID="2262a901-d392-434b-bd32-43555b67f428" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.018749 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" podStartSLOduration=138.018705416 podStartE2EDuration="2m18.018705416s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:01.987498962 +0000 UTC m=+165.028945519" watchObservedRunningTime="2025-12-04 12:16:02.018705416 +0000 UTC m=+165.060151983" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.020159 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" 
podStartSLOduration=139.020150433 podStartE2EDuration="2m19.020150433s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:02.018013583 +0000 UTC m=+165.059460150" watchObservedRunningTime="2025-12-04 12:16:02.020150433 +0000 UTC m=+165.061596990" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.063419 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" podStartSLOduration=138.063390512 podStartE2EDuration="2m18.063390512s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:02.062575545 +0000 UTC m=+165.104022112" watchObservedRunningTime="2025-12-04 12:16:02.063390512 +0000 UTC m=+165.104837079" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.064065 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.067772 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.567739824 +0000 UTC m=+165.609186381 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.173626 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.181361 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" podStartSLOduration=139.18133125 podStartE2EDuration="2m19.18133125s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:02.17674929 +0000 UTC m=+165.218195847" watchObservedRunningTime="2025-12-04 12:16:02.18133125 +0000 UTC m=+165.222777847" Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.185313 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.685280059 +0000 UTC m=+165.726726636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.187775 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.193106 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.693068346 +0000 UTC m=+165.734514913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.232376 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd" podStartSLOduration=139.232336614 podStartE2EDuration="2m19.232336614s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:02.229880572 +0000 UTC m=+165.271327139" watchObservedRunningTime="2025-12-04 12:16:02.232336614 +0000 UTC m=+165.273783181" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.235879 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" event={"ID":"4279cf63-15b8-49a2-ab5c-794bdbf0fda8","Type":"ContainerStarted","Data":"69c617a41eb278d8fc4ec82cf3a3ccfa4c7ec800a692d876325af4cba0fcaa24"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236004 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-c8jng" event={"ID":"4279cf63-15b8-49a2-ab5c-794bdbf0fda8","Type":"ContainerStarted","Data":"b91e2364026b61de9d915b0f6a5f60e7ed550524c1862989ec7204473c4e534e"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236032 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236092 4760 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236109 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2qtg7" event={"ID":"eceebe1d-88a4-4b8c-819d-75d63c04aeb4","Type":"ContainerStarted","Data":"73625faa30abe22db6f863dd0a9ed531055b901932969a0c4a566873e1aaeab8"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236128 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" event={"ID":"2262a901-d392-434b-bd32-43555b67f428","Type":"ContainerStarted","Data":"d1417a77a6420bb4709d113547dfa6fd8bcfa406b64291604746a1c6221f68c2"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236178 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" event={"ID":"2262a901-d392-434b-bd32-43555b67f428","Type":"ContainerStarted","Data":"04c3536daae0c404bb35f62d677eba808a56118428e89deb7d9d86062b6a41aa"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236195 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" event={"ID":"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4","Type":"ContainerStarted","Data":"a8bfd8bd62968fabbb0a444131f404fc051d1c40ef80f6b8c23b0694862f973e"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236248 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" event={"ID":"e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f","Type":"ContainerStarted","Data":"82be13cbb7f6425078cd699f0d328a0a4cf55457c6df622eba38dcefc1aa01d6"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236273 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c7hmd" 
event={"ID":"760a4690-8674-4847-af35-49eec4903ef2","Type":"ContainerStarted","Data":"947bf0f44be4b0418049dfffb8968e597814328ab0e514a60f4ae43ef723dedb"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236292 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-58wzk" event={"ID":"2623f14b-9edc-48cd-aeba-08cc1155890f","Type":"ContainerStarted","Data":"9153cbb3f83a9e654bd6719f89004169053b2df3728b9f88a5ba29da832f8d90"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.236318 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5lxbp" event={"ID":"8c791b99-3020-4b20-9d91-87c5ba9f615a","Type":"ContainerStarted","Data":"aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.279455 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" event={"ID":"2914d4c6-5bae-4d02-aaf6-13556172e946","Type":"ContainerStarted","Data":"94ad79d6da7248f4c2a6c61f81f5d90781bf71afdcd8e51739c1ac1b65cab944"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.291565 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-5lxbp" podStartSLOduration=139.291544525 podStartE2EDuration="2m19.291544525s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:02.285811747 +0000 UTC m=+165.327258324" watchObservedRunningTime="2025-12-04 12:16:02.291544525 +0000 UTC m=+165.332991092" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.303917 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q" 
event={"ID":"78d9eb51-de42-4536-a561-aee39bfd92f3","Type":"ContainerStarted","Data":"2c4655ce2125bddb69d16727a39d4827c61aaf50f5602aca18de7504ca4a30a2"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.310552 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.310706 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.810678763 +0000 UTC m=+165.852125340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.311670 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.313054 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.813036501 +0000 UTC m=+165.854483068 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.324757 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" podStartSLOduration=139.324736413 podStartE2EDuration="2m19.324736413s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:02.322571733 +0000 UTC m=+165.364018310" watchObservedRunningTime="2025-12-04 12:16:02.324736413 +0000 UTC m=+165.366182980" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.330173 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" event={"ID":"89ac52b9-3fe5-4f64-b128-f88467ed56d4","Type":"ContainerStarted","Data":"83d6c48684e3a6adda3c9d3528c70e0a89eca58014d2cdd02e8bc1251d0d5049"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.346743 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" event={"ID":"1c03511a-3670-4fe4-929f-5ceaa64d0cb4","Type":"ContainerStarted","Data":"5bc01c488f7a22e47ab13563b7948229939b82602ecb2d6f8e705dc1be4a761e"} Dec 04 12:16:02 crc 
kubenswrapper[4760]: I1204 12:16:02.346834 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" event={"ID":"1c03511a-3670-4fe4-929f-5ceaa64d0cb4","Type":"ContainerStarted","Data":"cefe523eea751ec6ffd57117bdfc7dd671998fb55fd21b7068b26deb6df7be31"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.348326 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.355513 4760 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8l6bv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.355610 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" podUID="1c03511a-3670-4fe4-929f-5ceaa64d0cb4" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.359832 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" event={"ID":"8f32fc56-11e1-419a-9b6f-1e6375b1b6a0","Type":"ContainerStarted","Data":"9bc15a95c312d96a5b32a8579453efd0214c8817830f2c703418d8cc44f50dcf"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.412568 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.413920 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:02.913897919 +0000 UTC m=+165.955344486 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.434817 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" event={"ID":"87218323-b321-4cd8-8da1-5fa8769eb3b0","Type":"ContainerStarted","Data":"701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820"} Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.441029 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-lms2q" podStartSLOduration=139.440995637 podStartE2EDuration="2m19.440995637s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:02.440890294 +0000 UTC m=+165.482336871" watchObservedRunningTime="2025-12-04 12:16:02.440995637 +0000 UTC m=+165.482442204" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.457039 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:02 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:02 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:02 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.457103 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.470357 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nw7pr" podStartSLOduration=138.470337809 podStartE2EDuration="2m18.470337809s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:02.467476696 +0000 UTC m=+165.508923263" watchObservedRunningTime="2025-12-04 12:16:02.470337809 +0000 UTC m=+165.511784376" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.488202 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2qbls" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.517888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.518406 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:03.018391826 +0000 UTC m=+166.059838393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.572604 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" podStartSLOduration=138.572585794 podStartE2EDuration="2m18.572585794s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:02.541899787 +0000 UTC m=+165.583346354" watchObservedRunningTime="2025-12-04 12:16:02.572585794 +0000 UTC m=+165.614032361" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.622989 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.625319 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 12:16:03.125277952 +0000 UTC m=+166.166724689 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.653632 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" podStartSLOduration=139.653590361 podStartE2EDuration="2m19.653590361s" podCreationTimestamp="2025-12-04 12:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:02.626153201 +0000 UTC m=+165.667599768" watchObservedRunningTime="2025-12-04 12:16:02.653590361 +0000 UTC m=+165.695036938" Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.726511 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.730321 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:03.230297377 +0000 UTC m=+166.271743944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.836561 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.837058 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:03.337038488 +0000 UTC m=+166.378485065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:02 crc kubenswrapper[4760]: I1204 12:16:02.938178 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:02 crc kubenswrapper[4760]: E1204 12:16:02.939044 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:03.439027593 +0000 UTC m=+166.480474160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.042068 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:03 crc kubenswrapper[4760]: E1204 12:16:03.042550 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:03.542523858 +0000 UTC m=+166.583970415 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.042950 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:03 crc kubenswrapper[4760]: E1204 12:16:03.043499 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:03.54348974 +0000 UTC m=+166.584936307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.144143 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:03 crc kubenswrapper[4760]: E1204 12:16:03.144667 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:03.644636168 +0000 UTC m=+166.686082735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.247496 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:03 crc kubenswrapper[4760]: E1204 12:16:03.248122 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:03.748092061 +0000 UTC m=+166.789538628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.352079 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:03 crc kubenswrapper[4760]: E1204 12:16:03.353321 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:03.853290432 +0000 UTC m=+166.894736989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.391051 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.391146 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.485980 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:03 crc kubenswrapper[4760]: E1204 12:16:03.487193 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 12:16:03.987172273 +0000 UTC m=+167.028618840 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.537038 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:03 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:03 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:03 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.537560 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.587804 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:03 crc kubenswrapper[4760]: E1204 12:16:03.588183 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:04.088168475 +0000 UTC m=+167.129615042 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.726649 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:03 crc kubenswrapper[4760]: E1204 12:16:03.727157 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:04.227136615 +0000 UTC m=+167.268583182 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:03 crc kubenswrapper[4760]: I1204 12:16:03.913956 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:03 crc kubenswrapper[4760]: E1204 12:16:03.914578 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:04.414540301 +0000 UTC m=+167.455986858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.039267 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:04 crc kubenswrapper[4760]: E1204 12:16:04.039862 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:04.539836441 +0000 UTC m=+167.581283008 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.087221 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" event={"ID":"2dd314b5-2af3-4c1f-9ee9-406c48faaf78","Type":"ContainerStarted","Data":"aa9e6022c34767dbff1db393af4229267a25c28395545cf1a761bb2d3a289a63"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.087478 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" event={"ID":"2dd314b5-2af3-4c1f-9ee9-406c48faaf78","Type":"ContainerStarted","Data":"17eba5e37ddcc130aac46ae3be831f9a7bb1db7b409f29b29466758ef2d678a1"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.134294 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" event={"ID":"89ac52b9-3fe5-4f64-b128-f88467ed56d4","Type":"ContainerStarted","Data":"409f3c4289a3e1d0714924781a9257cbd6e2b4a5f691c07ca8fd308eb4ba2a66"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.136374 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.165046 4760 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jw2zk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.166155 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" podUID="89ac52b9-3fe5-4f64-b128-f88467ed56d4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.174719 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:04 crc kubenswrapper[4760]: E1204 12:16:04.178471 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:04.678416566 +0000 UTC m=+167.719863143 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.192719 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" event={"ID":"96a8ab6f-5954-48fc-bc24-738807b91ea4","Type":"ContainerStarted","Data":"af5a0f126a7fb8d4c94e280b6509b15bff543feb2243771ce452ff68acf79e9c"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.197760 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" event={"ID":"96a8ab6f-5954-48fc-bc24-738807b91ea4","Type":"ContainerStarted","Data":"1d88577234723018ee3e49646db333fcb7a75c27633d6ed83edc91be057684a8"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.197802 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:04 crc kubenswrapper[4760]: E1204 12:16:04.200282 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:04.700256523 +0000 UTC m=+167.741703090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.256589 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f9rr2" event={"ID":"55feea70-8c8d-41ad-afd1-95f0ad24faf0","Type":"ContainerStarted","Data":"7986e03d089422b44124d1c582a720044157fd20b5968025de51dc1554f13937"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.288804 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8br2p" podStartSLOduration=140.288776386 podStartE2EDuration="2m20.288776386s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:04.253017283 +0000 UTC m=+167.294463850" watchObservedRunningTime="2025-12-04 12:16:04.288776386 +0000 UTC m=+167.330222953" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.300826 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:04 crc kubenswrapper[4760]: E1204 12:16:04.301408 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:04.801378139 +0000 UTC m=+167.842824706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.308043 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" event={"ID":"0341ec2d-564b-46ec-a993-625f66f899c8","Type":"ContainerStarted","Data":"210e7686a52f88bbb951986f972c864f463da2a8c937286ce746cec8e76f342b"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.329725 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" event={"ID":"b8b39e58-fac2-4827-8bee-b3e6e11c1a43","Type":"ContainerStarted","Data":"bb8e6a1b9d155d0e68935e44558e6871538e67c7c329b4ac3eeee353879ede4f"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.329791 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" event={"ID":"b8b39e58-fac2-4827-8bee-b3e6e11c1a43","Type":"ContainerStarted","Data":"2350747d541f6d96f03742921b2c4213b1f7d32f4c7b33243ea4d6bfdbf73a5f"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.357501 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw" 
event={"ID":"fdbd7bc3-cca1-4368-814a-126ba13a4f8e","Type":"ContainerStarted","Data":"e85631e9b673b4601a64349c2376b3e4ed4a67a30addf37df3d39fcf0224842c"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.357581 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw" event={"ID":"fdbd7bc3-cca1-4368-814a-126ba13a4f8e","Type":"ContainerStarted","Data":"faa1cd79b41af4f89c1219691ce112340b668842b4babfa51ff32c9a0f75dfaf"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.367420 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" event={"ID":"645c4439-d2f2-44a5-a115-566f5c436729","Type":"ContainerStarted","Data":"32c2e63cc06219f4407a2ab48b3180f1f7dd77a72a9597c0ae03901e8057fcbb"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.367488 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" event={"ID":"645c4439-d2f2-44a5-a115-566f5c436729","Type":"ContainerStarted","Data":"25dcff40dcfc11b873bf6eabb8be18d2da98599573f8a3a40e5e1c75b71062f0"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.368814 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.370301 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" event={"ID":"b865046e-88e6-4c7b-b67e-9d7342619f52","Type":"ContainerStarted","Data":"afbc716180e640336fcdeee16efec10672cd822ea105e7f09e9ce3eb29de537a"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.370379 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" 
event={"ID":"b865046e-88e6-4c7b-b67e-9d7342619f52","Type":"ContainerStarted","Data":"9136ac89d049b14f5d6c411c69b6be6b85d8f8499b9eff574a35d190f0c239b0"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.371554 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" event={"ID":"da66d59c-5026-4b0f-bbda-c064af724a28","Type":"ContainerStarted","Data":"c98d4165f734f107d23ff2e4c9a9f79d8c2bc3e970831cebe7e24438d0fc6511"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.371587 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" event={"ID":"da66d59c-5026-4b0f-bbda-c064af724a28","Type":"ContainerStarted","Data":"3c15579a14f1bcdb1b8ae41abf8cfd372115eafb1e781ae7232c28bd6bdb92e4"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.373813 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" event={"ID":"80cdd4b8-9eeb-42d5-ad43-441e1037f5c4","Type":"ContainerStarted","Data":"e5c584474fd44f14624782f70e1b74e84da9ed3faacb838f1c75c285c19295ee"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.375935 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" event={"ID":"8f95c4be-368d-4bf1-a109-ee96a5da7491","Type":"ContainerStarted","Data":"740b6f17cfab49a93b5daaea6716f0a5338e6808612d6af0d0325775e41f74c9"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.375968 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" event={"ID":"8f95c4be-368d-4bf1-a109-ee96a5da7491","Type":"ContainerStarted","Data":"982bf70a185147214ad592bcdd95f060af881c3812f70e5b69689673d3eeacdc"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.379355 4760 
patch_prober.go:28] interesting pod/catalog-operator-68c6474976-cs9z4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.379456 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" podUID="645c4439-d2f2-44a5-a115-566f5c436729" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.381807 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" event={"ID":"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0","Type":"ContainerStarted","Data":"9bf669ac9023994bb14325c673087aa4cb6a2eb0c475d77c77268754cbdb8f61"} Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.462000 4760 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8l6bv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.462083 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" podUID="1c03511a-3670-4fe4-929f-5ceaa64d0cb4" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.463427 4760 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qvdzl container/marketplace-operator 
namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.463470 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" podUID="2262a901-d392-434b-bd32-43555b67f428" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.463523 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.466935 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.467915 4760 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pqn2r container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.468031 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" podUID="e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection 
refused" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.470339 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:04 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:04 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:04 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.470403 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.478511 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nwwhw" podStartSLOduration=140.478490499 podStartE2EDuration="2m20.478490499s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:04.477644231 +0000 UTC m=+167.519090798" watchObservedRunningTime="2025-12-04 12:16:04.478490499 +0000 UTC m=+167.519937076" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.485134 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" podStartSLOduration=140.485111616 podStartE2EDuration="2m20.485111616s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:04.359854698 +0000 UTC m=+167.401301265" 
watchObservedRunningTime="2025-12-04 12:16:04.485111616 +0000 UTC m=+167.526558183" Dec 04 12:16:04 crc kubenswrapper[4760]: E1204 12:16:04.486905 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:04.986887924 +0000 UTC m=+168.028334481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.564689 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" podStartSLOduration=140.564671215 podStartE2EDuration="2m20.564671215s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:04.563727915 +0000 UTC m=+167.605174482" watchObservedRunningTime="2025-12-04 12:16:04.564671215 +0000 UTC m=+167.606117782" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.585715 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:04 crc kubenswrapper[4760]: E1204 12:16:04.588161 4760 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:05.088127495 +0000 UTC m=+168.129574182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.699480 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:04 crc kubenswrapper[4760]: E1204 12:16:04.700072 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:05.200051416 +0000 UTC m=+168.241497983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.800968 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:04 crc kubenswrapper[4760]: E1204 12:16:04.802411 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:05.302360302 +0000 UTC m=+168.343806869 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.868128 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" podStartSLOduration=64.868084758 podStartE2EDuration="1m4.868084758s" podCreationTimestamp="2025-12-04 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:04.775948676 +0000 UTC m=+167.817395243" watchObservedRunningTime="2025-12-04 12:16:04.868084758 +0000 UTC m=+167.909531325" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.908367 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:04 crc kubenswrapper[4760]: E1204 12:16:04.909119 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:05.409100813 +0000 UTC m=+168.450547380 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.915548 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.915631 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.997449 4760 patch_prober.go:28] interesting pod/apiserver-76f77b778f-jz7p7 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]log ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]etcd ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]poststarthook/generic-apiserver-start-informers ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]poststarthook/max-in-flight-filter ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 04 12:16:04 crc kubenswrapper[4760]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 04 12:16:04 crc kubenswrapper[4760]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 04 12:16:04 crc 
kubenswrapper[4760]: [+]poststarthook/project.openshift.io-projectcache ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]poststarthook/openshift.io-startinformers ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 04 12:16:04 crc kubenswrapper[4760]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 04 12:16:04 crc kubenswrapper[4760]: livez check failed Dec 04 12:16:04 crc kubenswrapper[4760]: I1204 12:16:04.997546 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7" podUID="2914d4c6-5bae-4d02-aaf6-13556172e946" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.030568 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.098040 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:05 crc kubenswrapper[4760]: E1204 12:16:05.098372 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:05.598339791 +0000 UTC m=+168.639786358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.098448 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:05 crc kubenswrapper[4760]: E1204 12:16:05.098789 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:05.598778725 +0000 UTC m=+168.640225292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.151119 4760 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pqn2r container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.151711 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" podUID="e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.151175 4760 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-pqn2r container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.151806 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" podUID="e2a04888-17e2-4e76-aa2a-6d1fe1ccfd0f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: 
connection refused" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.170899 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8qssb" podStartSLOduration=141.17087312 podStartE2EDuration="2m21.17087312s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:05.108037928 +0000 UTC m=+168.149484495" watchObservedRunningTime="2025-12-04 12:16:05.17087312 +0000 UTC m=+168.212319687" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.200028 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:05 crc kubenswrapper[4760]: E1204 12:16:05.200677 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:05.700651896 +0000 UTC m=+168.742098463 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.295460 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" podStartSLOduration=141.295423615 podStartE2EDuration="2m21.295423615s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:05.171937724 +0000 UTC m=+168.213384301" watchObservedRunningTime="2025-12-04 12:16:05.295423615 +0000 UTC m=+168.336870182" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.296082 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8jnmg" podStartSLOduration=141.296070106 podStartE2EDuration="2m21.296070106s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:05.291936781 +0000 UTC m=+168.333383348" watchObservedRunningTime="2025-12-04 12:16:05.296070106 +0000 UTC m=+168.337516673" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.302162 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:05 crc kubenswrapper[4760]: E1204 12:16:05.302933 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:05.80289545 +0000 UTC m=+168.844342077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.408692 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:05 crc kubenswrapper[4760]: E1204 12:16:05.409156 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:05.909126355 +0000 UTC m=+168.950572922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.433692 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" event={"ID":"b8b39e58-fac2-4827-8bee-b3e6e11c1a43","Type":"ContainerStarted","Data":"948d61c25578f35e2385eb38c85bf092a1e45fe431a33d9b5c9063eb53b5b011"} Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.435301 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.441470 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:05 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:05 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:05 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.441564 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.454956 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-58wzk" event={"ID":"2623f14b-9edc-48cd-aeba-08cc1155890f","Type":"ContainerStarted","Data":"ca86a0a08074f511cb6d555114846e7228c05821a9288c44721c395925a27f89"} Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.458355 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" event={"ID":"b865046e-88e6-4c7b-b67e-9d7342619f52","Type":"ContainerStarted","Data":"682395ad6bec4aef1205d350665d482eb1937d5fc9e1fbdbbddce520167976b6"} Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.465403 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f9rr2" event={"ID":"55feea70-8c8d-41ad-afd1-95f0ad24faf0","Type":"ContainerStarted","Data":"68d93d82078c20eb24a1513c9a5036d453180a9f77f66545d2c2bbf1d379a8fd"} Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.466619 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-f9rr2" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.483683 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" event={"ID":"8f95c4be-368d-4bf1-a109-ee96a5da7491","Type":"ContainerStarted","Data":"6f7840b47385b12e305eff3fcdf3718e97e03b83ac99f44362fbbe95be45b653"} Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.510431 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:05 crc kubenswrapper[4760]: E1204 12:16:05.511422 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:06.011400999 +0000 UTC m=+169.052847556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.569093 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" podStartSLOduration=141.569066442 podStartE2EDuration="2m21.569066442s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:05.475017556 +0000 UTC m=+168.516464113" watchObservedRunningTime="2025-12-04 12:16:05.569066442 +0000 UTC m=+168.610512999" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.571858 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-f9rr2" podStartSLOduration=11.571849443 podStartE2EDuration="11.571849443s" podCreationTimestamp="2025-12-04 12:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:05.570682914 +0000 UTC m=+168.612129481" watchObservedRunningTime="2025-12-04 12:16:05.571849443 +0000 UTC m=+168.613296010" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.638595 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-zqr7n" event={"ID":"0341ec2d-564b-46ec-a993-625f66f899c8","Type":"ContainerStarted","Data":"d3e601a09de626b40aabce4f925e098926763ac0c8087a9f2706276e9c4a936f"} Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.651594 4760 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jw2zk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.651681 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" podUID="89ac52b9-3fe5-4f64-b128-f88467ed56d4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.652523 4760 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-cs9z4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.652571 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" podUID="645c4439-d2f2-44a5-a115-566f5c436729" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.653146 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:05 crc kubenswrapper[4760]: E1204 12:16:05.655037 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:06.15499781 +0000 UTC m=+169.196444377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.664444 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:05 crc kubenswrapper[4760]: E1204 12:16:05.673101 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:06.173037121 +0000 UTC m=+169.214483708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.691898 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8l6bv" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.702455 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q2tmn" podStartSLOduration=141.702419715 podStartE2EDuration="2m21.702419715s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:05.70106286 +0000 UTC m=+168.742509447" watchObservedRunningTime="2025-12-04 12:16:05.702419715 +0000 UTC m=+168.743866282" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.723363 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6brwl" podStartSLOduration=141.723330031 podStartE2EDuration="2m21.723330031s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:05.602408755 +0000 UTC m=+168.643855352" watchObservedRunningTime="2025-12-04 12:16:05.723330031 +0000 UTC m=+168.764776598" Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.787084 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:05 crc kubenswrapper[4760]: E1204 12:16:05.788613 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:06.288576121 +0000 UTC m=+169.330022688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.890627 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:05 crc kubenswrapper[4760]: E1204 12:16:05.892660 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:06.392640844 +0000 UTC m=+169.434087401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:05 crc kubenswrapper[4760]: I1204 12:16:05.992690 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:05.993017 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:06.493001506 +0000 UTC m=+169.534448073 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.148632 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.148687 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.149458 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:06.149979 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:06.649965535 +0000 UTC m=+169.691412102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.157900 4760 patch_prober.go:28] interesting pod/console-f9d7485db-5lxbp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.158002 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5lxbp" podUID="8c791b99-3020-4b20-9d91-87c5ba9f615a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.158694 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.158886 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.159839 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server 
namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.159921 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.250442 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:06.250566 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:06.750542124 +0000 UTC m=+169.791988691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.250865 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:06.251899 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:06.751886178 +0000 UTC m=+169.793332745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.352615 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:06.353313 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:06.853234853 +0000 UTC m=+169.894681420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.434259 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:06 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:06 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:06 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.434361 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.460401 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:06.460754 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 12:16:06.960735948 +0000 UTC m=+170.002182515 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:06.561928 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.061911547 +0000 UTC m=+170.103358114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.561802 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:06.562489 4760 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.062464746 +0000 UTC m=+170.103911313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.562516 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.646012 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" event={"ID":"96a8ab6f-5954-48fc-bc24-738807b91ea4","Type":"ContainerStarted","Data":"e9cf36329c37e1ee8118faa9ab11a760ea0ba2cd3415a66f8b9a6a7487fd95e5"} Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.664319 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.664369 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cs9z4" Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:06.664626 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.164599465 +0000 UTC m=+170.206046032 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.687827 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-zpxt2" podStartSLOduration=142.687802427 podStartE2EDuration="2m22.687802427s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:06.685245273 +0000 UTC m=+169.726691840" watchObservedRunningTime="2025-12-04 12:16:06.687802427 +0000 UTC m=+169.729248994" Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.767430 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:06.769058 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.269041161 +0000 UTC m=+170.310487728 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:06 crc kubenswrapper[4760]: I1204 12:16:06.868714 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:06 crc kubenswrapper[4760]: E1204 12:16:06.869086 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.369070092 +0000 UTC m=+170.410516659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.016653 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.017512 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.517494461 +0000 UTC m=+170.558941028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.118672 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.119124 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.619107164 +0000 UTC m=+170.660553731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.211817 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.227823 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.228341 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.728325096 +0000 UTC m=+170.769771663 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.328838 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.329286 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.829262877 +0000 UTC m=+170.870709454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.329521 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.330051 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.830017892 +0000 UTC m=+170.871464459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.430133 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.430558 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.430638 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.930618022 +0000 UTC m=+170.972064589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.431459 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.431835 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:07.931823431 +0000 UTC m=+170.973269998 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.436444 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:07 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:07 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:07 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.436515 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.477123 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.477937 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.486693 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jw2zk" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.500884 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.501065 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.524111 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.533110 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.533338 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:08.03329668 +0000 UTC m=+171.074743237 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.533412 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.533600 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.533854 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.535626 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 12:16:08.035603185 +0000 UTC m=+171.077049752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.667284 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.668101 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:08.168080911 +0000 UTC m=+171.209527488 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.668156 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.668241 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.668324 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.668686 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 12:16:08.168676061 +0000 UTC m=+171.210122638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.668744 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.705078 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-58wzk" event={"ID":"2623f14b-9edc-48cd-aeba-08cc1155890f","Type":"ContainerStarted","Data":"a1bd73cd8f79354ce4b3547ad11c393c1b9ae9304db7355017a3a60a4f8dea5f"} Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.769579 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.769812 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 12:16:08.269759496 +0000 UTC m=+171.311206063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.770654 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.772782 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:08.272761184 +0000 UTC m=+171.314207751 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.960956 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.961292 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:16:07 crc kubenswrapper[4760]: E1204 12:16:07.962554 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:08.462493788 +0000 UTC m=+171.503940355 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:07 crc kubenswrapper[4760]: I1204 12:16:07.975611 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4fd6a47-556a-4236-9f60-0e7996e4608a-metrics-certs\") pod \"network-metrics-daemon-xpngr\" (UID: \"b4fd6a47-556a-4236-9f60-0e7996e4608a\") " pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.043196 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.071626 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:08 crc kubenswrapper[4760]: E1204 12:16:08.072112 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 12:16:08.572087563 +0000 UTC m=+171.613534130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.097691 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xpngr" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.145695 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.219620 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:08 crc kubenswrapper[4760]: E1204 12:16:08.220021 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:08.719988774 +0000 UTC m=+171.761435351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.220252 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-pqn2r" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.471167 4760 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.478257 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:08 crc kubenswrapper[4760]: E1204 12:16:08.478728 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:08.97871365 +0000 UTC m=+172.020160217 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.491832 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:08 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:08 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:08 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.491896 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.584813 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:08 crc kubenswrapper[4760]: E1204 12:16:08.585146 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 12:16:09.08511926 +0000 UTC m=+172.126565857 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.686157 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:08 crc kubenswrapper[4760]: E1204 12:16:08.686506 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:09.186494036 +0000 UTC m=+172.227940603 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.720753 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-58wzk" event={"ID":"2623f14b-9edc-48cd-aeba-08cc1155890f","Type":"ContainerStarted","Data":"38fe9f5b5c9c94b88f210acef458e7f5b3aea4080029bed9ef0693d177e925c0"} Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.787349 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:08 crc kubenswrapper[4760]: E1204 12:16:08.787571 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:09.28753905 +0000 UTC m=+172.328985627 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.787747 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:08 crc kubenswrapper[4760]: E1204 12:16:08.788148 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:09.28814182 +0000 UTC m=+172.329588387 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.889935 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:08 crc kubenswrapper[4760]: E1204 12:16:08.890631 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:09.390613951 +0000 UTC m=+172.432060518 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.922943 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-94pdb"] Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.935848 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94pdb"] Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.935992 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.977509 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.991472 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-catalog-content\") pod \"community-operators-94pdb\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.991524 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.991725 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9pv2\" (UniqueName: \"kubernetes.io/projected/6ea41f2f-f148-4280-b046-1bea756a117a-kube-api-access-k9pv2\") pod \"community-operators-94pdb\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:16:08 crc kubenswrapper[4760]: I1204 12:16:08.991806 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-utilities\") pod \"community-operators-94pdb\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:16:08 crc kubenswrapper[4760]: E1204 12:16:08.991892 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:09.491876763 +0000 UTC m=+172.533323410 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.092903 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.093110 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9pv2\" (UniqueName: \"kubernetes.io/projected/6ea41f2f-f148-4280-b046-1bea756a117a-kube-api-access-k9pv2\") pod \"community-operators-94pdb\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:16:09 crc kubenswrapper[4760]: E1204 12:16:09.093169 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:09.593136624 +0000 UTC m=+172.634583191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.093238 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-utilities\") pod \"community-operators-94pdb\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " pod="openshift-marketplace/community-operators-94pdb"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.093442 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-catalog-content\") pod \"community-operators-94pdb\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " pod="openshift-marketplace/community-operators-94pdb"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.093503 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.093951 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-utilities\") pod \"community-operators-94pdb\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " pod="openshift-marketplace/community-operators-94pdb"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.094053 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-catalog-content\") pod \"community-operators-94pdb\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " pod="openshift-marketplace/community-operators-94pdb"
Dec 04 12:16:09 crc kubenswrapper[4760]: E1204 12:16:09.094203 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:09.594185428 +0000 UTC m=+172.635631995 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.116540 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bptn9"]
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.117853 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.119973 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.138241 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9pv2\" (UniqueName: \"kubernetes.io/projected/6ea41f2f-f148-4280-b046-1bea756a117a-kube-api-access-k9pv2\") pod \"community-operators-94pdb\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " pod="openshift-marketplace/community-operators-94pdb"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.151755 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bptn9"]
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.161560 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.213197 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.214271 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz7mw\" (UniqueName: \"kubernetes.io/projected/a3906714-b46d-4640-be9f-d57ba0fd27bb-kube-api-access-tz7mw\") pod \"certified-operators-bptn9\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.214316 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-utilities\") pod \"certified-operators-bptn9\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: E1204 12:16:09.214554 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 12:16:09.714395202 +0000 UTC m=+172.755841819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.214685 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-catalog-content\") pod \"certified-operators-bptn9\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.214838 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx"
Dec 04 12:16:09 crc kubenswrapper[4760]: E1204 12:16:09.215498 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 12:16:09.715490057 +0000 UTC m=+172.756936624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cwwhx" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.221117 4760 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-04T12:16:08.475185225Z","Handler":null,"Name":""}
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.245064 4760 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.245119 4760 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.307887 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94pdb"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.316795 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.317143 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-catalog-content\") pod \"certified-operators-bptn9\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.317200 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz7mw\" (UniqueName: \"kubernetes.io/projected/a3906714-b46d-4640-be9f-d57ba0fd27bb-kube-api-access-tz7mw\") pod \"certified-operators-bptn9\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.317242 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-utilities\") pod \"certified-operators-bptn9\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.321010 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-catalog-content\") pod \"certified-operators-bptn9\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.321069 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-utilities\") pod \"certified-operators-bptn9\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.328654 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.331461 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rpnt8"]
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.334299 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.351971 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rpnt8"]
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.357310 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz7mw\" (UniqueName: \"kubernetes.io/projected/a3906714-b46d-4640-be9f-d57ba0fd27bb-kube-api-access-tz7mw\") pod \"certified-operators-bptn9\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.383742 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xpngr"]
Dec 04 12:16:09 crc kubenswrapper[4760]: W1204 12:16:09.410867 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4fd6a47_556a_4236_9f60_0e7996e4608a.slice/crio-9c0085b40a1c21db7760277724e50add53ed2c9e337f773d3ae43c4ebee04207 WatchSource:0}: Error finding container 9c0085b40a1c21db7760277724e50add53ed2c9e337f773d3ae43c4ebee04207: Status 404 returned error can't find the container with id 9c0085b40a1c21db7760277724e50add53ed2c9e337f773d3ae43c4ebee04207
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.419584 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lhnj\" (UniqueName: \"kubernetes.io/projected/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-kube-api-access-8lhnj\") pod \"community-operators-rpnt8\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.419849 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-utilities\") pod \"community-operators-rpnt8\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.419993 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-catalog-content\") pod \"community-operators-rpnt8\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.420167 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.428017 4760 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.428071 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.436514 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 12:16:09 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld
Dec 04 12:16:09 crc kubenswrapper[4760]: [+]process-running ok
Dec 04 12:16:09 crc kubenswrapper[4760]: healthz check failed
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.436596 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.493935 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bptn9"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.500152 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l9t6h"]
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.513485 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.524283 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lhnj\" (UniqueName: \"kubernetes.io/projected/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-kube-api-access-8lhnj\") pod \"community-operators-rpnt8\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.524352 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-utilities\") pod \"community-operators-rpnt8\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.524394 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-catalog-content\") pod \"community-operators-rpnt8\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.525111 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-catalog-content\") pod \"community-operators-rpnt8\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.525411 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-utilities\") pod \"community-operators-rpnt8\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.531415 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l9t6h"]
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.570339 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lhnj\" (UniqueName: \"kubernetes.io/projected/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-kube-api-access-8lhnj\") pod \"community-operators-rpnt8\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.571335 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cwwhx\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.626370 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bhrg\" (UniqueName: \"kubernetes.io/projected/55ebb09b-1c59-4289-92f0-847b3c655fa9-kube-api-access-8bhrg\") pod \"certified-operators-l9t6h\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.626464 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-utilities\") pod \"certified-operators-l9t6h\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.626570 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-catalog-content\") pod \"certified-operators-l9t6h\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.679891 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.728600 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bhrg\" (UniqueName: \"kubernetes.io/projected/55ebb09b-1c59-4289-92f0-847b3c655fa9-kube-api-access-8bhrg\") pod \"certified-operators-l9t6h\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.728750 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-utilities\") pod \"certified-operators-l9t6h\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.728843 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-catalog-content\") pod \"certified-operators-l9t6h\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.729767 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-catalog-content\") pod \"certified-operators-l9t6h\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.729942 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-utilities\") pod \"certified-operators-l9t6h\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.754062 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rpnt8"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.754708 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bhrg\" (UniqueName: \"kubernetes.io/projected/55ebb09b-1c59-4289-92f0-847b3c655fa9-kube-api-access-8bhrg\") pod \"certified-operators-l9t6h\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.754982 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-58wzk" event={"ID":"2623f14b-9edc-48cd-aeba-08cc1155890f","Type":"ContainerStarted","Data":"9ad253ea7936af3ae90771f2588343b2d9609a86f77004cee3a69795962f340a"}
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.756861 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xpngr" event={"ID":"b4fd6a47-556a-4236-9f60-0e7996e4608a","Type":"ContainerStarted","Data":"9c0085b40a1c21db7760277724e50add53ed2c9e337f773d3ae43c4ebee04207"}
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.758935 4760 generic.go:334] "Generic (PLEG): container finished" podID="dbd54983-b7fb-4b28-93fd-d2b9d5b881f0" containerID="9bf669ac9023994bb14325c673087aa4cb6a2eb0c475d77c77268754cbdb8f61" exitCode=0
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.759001 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" event={"ID":"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0","Type":"ContainerDied","Data":"9bf669ac9023994bb14325c673087aa4cb6a2eb0c475d77c77268754cbdb8f61"}
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.762673 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003","Type":"ContainerStarted","Data":"add883f105bc832e8952aec35da1b51a2715e888dbbe36118335fd8ce0a0f649"}
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.786574 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-58wzk" podStartSLOduration=15.786540149 podStartE2EDuration="15.786540149s" podCreationTimestamp="2025-12-04 12:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:09.779980164 +0000 UTC m=+172.821426731" watchObservedRunningTime="2025-12-04 12:16:09.786540149 +0000 UTC m=+172.827986716"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.854650 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l9t6h"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.895826 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.927998 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:16:09 crc kubenswrapper[4760]: I1204 12:16:09.957126 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-jz7p7"
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.205859 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94pdb"]
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.272973 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bptn9"]
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.438316 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 12:16:10 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld
Dec 04 12:16:10 crc kubenswrapper[4760]: [+]process-running ok
Dec 04 12:16:10 crc kubenswrapper[4760]: healthz check failed
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.438789 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.641808 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cwwhx"]
Dec 04 12:16:10 crc kubenswrapper[4760]: W1204 12:16:10.668581 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1081e99d_fbd4_4c6d_a1b5_19613da9ac2b.slice/crio-d404149401a41b920aeb26138ed81ab5d8828e1d1876b302e042ff1c1d62b329 WatchSource:0}: Error finding container d404149401a41b920aeb26138ed81ab5d8828e1d1876b302e042ff1c1d62b329: Status 404 returned error can't find the container with id d404149401a41b920aeb26138ed81ab5d8828e1d1876b302e042ff1c1d62b329
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.686077 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l9t6h"]
Dec 04 12:16:10 crc kubenswrapper[4760]: W1204 12:16:10.717444 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55ebb09b_1c59_4289_92f0_847b3c655fa9.slice/crio-b86859b48d41b8ef014530a58e3298e33079c694d07a39ccc07d8df57a61a9d2 WatchSource:0}: Error finding container b86859b48d41b8ef014530a58e3298e33079c694d07a39ccc07d8df57a61a9d2: Status 404 returned error can't find the container with id b86859b48d41b8ef014530a58e3298e33079c694d07a39ccc07d8df57a61a9d2
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.775534 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bptn9" event={"ID":"a3906714-b46d-4640-be9f-d57ba0fd27bb","Type":"ContainerStarted","Data":"57b56145cf76e6e35fca32e8addf4fec11b79d992cacfb8981b2a168e9937709"}
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.778400 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xpngr" event={"ID":"b4fd6a47-556a-4236-9f60-0e7996e4608a","Type":"ContainerStarted","Data":"83ed052cdd179b034fb964c9e6cd6c7f9b55bb514d64a9de4b144f5edbbe7156"}
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.779322 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pdb" event={"ID":"6ea41f2f-f148-4280-b046-1bea756a117a","Type":"ContainerStarted","Data":"386af51fa8df4a09f189354314363334676aad7bb7490e2fc452010cea258f8a"}
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.780780 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" event={"ID":"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b","Type":"ContainerStarted","Data":"d404149401a41b920aeb26138ed81ab5d8828e1d1876b302e042ff1c1d62b329"}
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.782340 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9t6h" event={"ID":"55ebb09b-1c59-4289-92f0-847b3c655fa9","Type":"ContainerStarted","Data":"b86859b48d41b8ef014530a58e3298e33079c694d07a39ccc07d8df57a61a9d2"}
Dec 04 12:16:10 crc kubenswrapper[4760]: I1204 12:16:10.784797 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003","Type":"ContainerStarted","Data":"64149d3eaca737dc3f1b2bc1bee84593613415cb316ffa2c7c8aec60658726a4"}
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.079715 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.079695555 podStartE2EDuration="4.079695555s" podCreationTimestamp="2025-12-04 12:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:11.068948012 +0000 UTC m=+174.110394579" watchObservedRunningTime="2025-12-04 12:16:11.079695555 +0000 UTC m=+174.121142122"
Dec 04 12:16:11 crc kubenswrapper[4760]: W1204 12:16:11.095578 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d9a363c_3a21_44a8_aeb0_720692d8ee7f.slice/crio-d0d456f6813fb7618cb2fa3e1d2f149698da3c2caf32c0843f45772b543d5786 WatchSource:0}: Error finding container d0d456f6813fb7618cb2fa3e1d2f149698da3c2caf32c0843f45772b543d5786: Status 404 returned error can't find the container with id d0d456f6813fb7618cb2fa3e1d2f149698da3c2caf32c0843f45772b543d5786
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.098174 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rpnt8"]
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.158727 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g48rj"]
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.160119 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g48rj"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.168748 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.281839 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-utilities\") pod \"redhat-marketplace-g48rj\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " pod="openshift-marketplace/redhat-marketplace-g48rj"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.281945 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-catalog-content\") pod \"redhat-marketplace-g48rj\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " pod="openshift-marketplace/redhat-marketplace-g48rj"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.281998 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8rnm\" (UniqueName: \"kubernetes.io/projected/2a7dec90-d501-40e2-9338-df345c0fd672-kube-api-access-m8rnm\") pod \"redhat-marketplace-g48rj\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " pod="openshift-marketplace/redhat-marketplace-g48rj"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.385948 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-catalog-content\") pod \"redhat-marketplace-g48rj\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " pod="openshift-marketplace/redhat-marketplace-g48rj"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.386024 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8rnm\" (UniqueName: \"kubernetes.io/projected/2a7dec90-d501-40e2-9338-df345c0fd672-kube-api-access-m8rnm\") pod \"redhat-marketplace-g48rj\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " pod="openshift-marketplace/redhat-marketplace-g48rj"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.386118 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-utilities\") pod \"redhat-marketplace-g48rj\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " pod="openshift-marketplace/redhat-marketplace-g48rj"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.386753 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-utilities\") pod \"redhat-marketplace-g48rj\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " pod="openshift-marketplace/redhat-marketplace-g48rj"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.387079 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-catalog-content\") pod \"redhat-marketplace-g48rj\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " pod="openshift-marketplace/redhat-marketplace-g48rj"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.415738 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g48rj"]
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.444370 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 12:16:11 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld
Dec 04 12:16:11 crc kubenswrapper[4760]: [+]process-running ok
Dec 04 12:16:11 crc kubenswrapper[4760]: healthz check failed
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.444466 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.451612 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8rnm\" (UniqueName: \"kubernetes.io/projected/2a7dec90-d501-40e2-9338-df345c0fd672-kube-api-access-m8rnm\") pod \"redhat-marketplace-g48rj\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " pod="openshift-marketplace/redhat-marketplace-g48rj"
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.509730 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lldkh"]
Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.511106
4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.530805 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lldkh"] Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.598570 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-catalog-content\") pod \"redhat-marketplace-lldkh\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.598626 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7kvz\" (UniqueName: \"kubernetes.io/projected/5a77f5f7-b738-4cba-94ca-06643a4ad964-kube-api-access-h7kvz\") pod \"redhat-marketplace-lldkh\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.598727 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-utilities\") pod \"redhat-marketplace-lldkh\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.675614 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.686805 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g48rj" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.701877 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-config-volume\") pod \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.701973 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-secret-volume\") pod \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.702049 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jshpm\" (UniqueName: \"kubernetes.io/projected/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-kube-api-access-jshpm\") pod \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\" (UID: \"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0\") " Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.702305 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-catalog-content\") pod \"redhat-marketplace-lldkh\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.702342 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7kvz\" (UniqueName: \"kubernetes.io/projected/5a77f5f7-b738-4cba-94ca-06643a4ad964-kube-api-access-h7kvz\") pod \"redhat-marketplace-lldkh\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 
12:16:11.702456 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-utilities\") pod \"redhat-marketplace-lldkh\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.702978 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-utilities\") pod \"redhat-marketplace-lldkh\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.703377 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-config-volume" (OuterVolumeSpecName: "config-volume") pod "dbd54983-b7fb-4b28-93fd-d2b9d5b881f0" (UID: "dbd54983-b7fb-4b28-93fd-d2b9d5b881f0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.703846 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-catalog-content\") pod \"redhat-marketplace-lldkh\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.714520 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-kube-api-access-jshpm" (OuterVolumeSpecName: "kube-api-access-jshpm") pod "dbd54983-b7fb-4b28-93fd-d2b9d5b881f0" (UID: "dbd54983-b7fb-4b28-93fd-d2b9d5b881f0"). InnerVolumeSpecName "kube-api-access-jshpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.716249 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dbd54983-b7fb-4b28-93fd-d2b9d5b881f0" (UID: "dbd54983-b7fb-4b28-93fd-d2b9d5b881f0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.728050 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7kvz\" (UniqueName: \"kubernetes.io/projected/5a77f5f7-b738-4cba-94ca-06643a4ad964-kube-api-access-h7kvz\") pod \"redhat-marketplace-lldkh\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.791177 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpnt8" event={"ID":"6d9a363c-3a21-44a8-aeb0-720692d8ee7f","Type":"ContainerStarted","Data":"d0d456f6813fb7618cb2fa3e1d2f149698da3c2caf32c0843f45772b543d5786"} Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.794850 4760 generic.go:334] "Generic (PLEG): container finished" podID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerID="c5ef78da41e1d274ca0e9fe46d11797497f9cc1ce8f97d77b4b39e3f1af7d57a" exitCode=0 Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.794923 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bptn9" event={"ID":"a3906714-b46d-4640-be9f-d57ba0fd27bb","Type":"ContainerDied","Data":"c5ef78da41e1d274ca0e9fe46d11797497f9cc1ce8f97d77b4b39e3f1af7d57a"} Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.796590 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 
12:16:11.797121 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xpngr" event={"ID":"b4fd6a47-556a-4236-9f60-0e7996e4608a","Type":"ContainerStarted","Data":"3e0cfcc29fe628925f17b86b47a5c20376672b7a50dacd621b7aca6e9cc0a2ea"} Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.800069 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" event={"ID":"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b","Type":"ContainerStarted","Data":"8ca9cfa98038a06ceec07bb193045ae851539156112b8d8efa1bd65d7740cfa9"} Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.800194 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.805179 4760 generic.go:334] "Generic (PLEG): container finished" podID="6ea41f2f-f148-4280-b046-1bea756a117a" containerID="fbaff96097442a615946b8877549b35b4f44b18bac0cbcd80208e596bd05e52a" exitCode=0 Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.805413 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pdb" event={"ID":"6ea41f2f-f148-4280-b046-1bea756a117a","Type":"ContainerDied","Data":"fbaff96097442a615946b8877549b35b4f44b18bac0cbcd80208e596bd05e52a"} Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.805708 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.805734 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.805747 4760 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jshpm\" (UniqueName: \"kubernetes.io/projected/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0-kube-api-access-jshpm\") on node \"crc\" DevicePath \"\"" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.807943 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" event={"ID":"dbd54983-b7fb-4b28-93fd-d2b9d5b881f0","Type":"ContainerDied","Data":"ec7e0605097284c5fedb4eaac78e880df0ace483ff13ff8e81a6ca575103ec2e"} Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.807965 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec7e0605097284c5fedb4eaac78e880df0ace483ff13ff8e81a6ca575103ec2e" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.808021 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.810004 4760 generic.go:334] "Generic (PLEG): container finished" podID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerID="3dd69ee3c13029103989b75db06b5be2d7000180ef8b8f9e76c8b6491ff7645b" exitCode=0 Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.810223 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9t6h" event={"ID":"55ebb09b-1c59-4289-92f0-847b3c655fa9","Type":"ContainerDied","Data":"3dd69ee3c13029103989b75db06b5be2d7000180ef8b8f9e76c8b6491ff7645b"} Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.846298 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xpngr" podStartSLOduration=147.846273339 podStartE2EDuration="2m27.846273339s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 12:16:11.845761243 +0000 UTC m=+174.887207810" watchObservedRunningTime="2025-12-04 12:16:11.846273339 +0000 UTC m=+174.887719926" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.906793 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" podStartSLOduration=147.906775304 podStartE2EDuration="2m27.906775304s" podCreationTimestamp="2025-12-04 12:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:11.90573843 +0000 UTC m=+174.947184997" watchObservedRunningTime="2025-12-04 12:16:11.906775304 +0000 UTC m=+174.948221871" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.972634 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:16:11 crc kubenswrapper[4760]: I1204 12:16:11.991185 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g48rj"] Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.105138 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pt6wf"] Dec 04 12:16:12 crc kubenswrapper[4760]: E1204 12:16:12.105451 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd54983-b7fb-4b28-93fd-d2b9d5b881f0" containerName="collect-profiles" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.105469 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd54983-b7fb-4b28-93fd-d2b9d5b881f0" containerName="collect-profiles" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.105645 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd54983-b7fb-4b28-93fd-d2b9d5b881f0" containerName="collect-profiles" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.106751 4760 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.112446 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.114406 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-catalog-content\") pod \"redhat-operators-pt6wf\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.114519 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65zzq\" (UniqueName: \"kubernetes.io/projected/8ba63da0-8512-4c36-a755-beaa01a7007b-kube-api-access-65zzq\") pod \"redhat-operators-pt6wf\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.114701 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-utilities\") pod \"redhat-operators-pt6wf\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.122807 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pt6wf"] Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.215554 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-utilities\") pod \"redhat-operators-pt6wf\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " 
pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.215664 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-catalog-content\") pod \"redhat-operators-pt6wf\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.215689 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65zzq\" (UniqueName: \"kubernetes.io/projected/8ba63da0-8512-4c36-a755-beaa01a7007b-kube-api-access-65zzq\") pod \"redhat-operators-pt6wf\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.216634 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-catalog-content\") pod \"redhat-operators-pt6wf\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.216948 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-utilities\") pod \"redhat-operators-pt6wf\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.283658 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65zzq\" (UniqueName: \"kubernetes.io/projected/8ba63da0-8512-4c36-a755-beaa01a7007b-kube-api-access-65zzq\") pod \"redhat-operators-pt6wf\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " pod="openshift-marketplace/redhat-operators-pt6wf" Dec 
04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.293972 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ql8tz"] Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.295188 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.310628 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.369658 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ql8tz"] Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.648341 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-catalog-content\") pod \"redhat-operators-ql8tz\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.648413 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq7j4\" (UniqueName: \"kubernetes.io/projected/78261f38-564b-4487-a29e-5edc6859825e-kube-api-access-sq7j4\") pod \"redhat-operators-ql8tz\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.648465 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-utilities\") pod \"redhat-operators-ql8tz\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 
12:16:12.652105 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:12 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:12 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:12 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.652189 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.655910 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-f9rr2" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.750611 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-catalog-content\") pod \"redhat-operators-ql8tz\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.750787 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq7j4\" (UniqueName: \"kubernetes.io/projected/78261f38-564b-4487-a29e-5edc6859825e-kube-api-access-sq7j4\") pod \"redhat-operators-ql8tz\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.750850 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-utilities\") 
pod \"redhat-operators-ql8tz\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.751725 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-catalog-content\") pod \"redhat-operators-ql8tz\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:12 crc kubenswrapper[4760]: I1204 12:16:12.753380 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-utilities\") pod \"redhat-operators-ql8tz\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:13 crc kubenswrapper[4760]: I1204 12:16:12.973186 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq7j4\" (UniqueName: \"kubernetes.io/projected/78261f38-564b-4487-a29e-5edc6859825e-kube-api-access-sq7j4\") pod \"redhat-operators-ql8tz\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:13 crc kubenswrapper[4760]: I1204 12:16:12.987838 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g48rj" event={"ID":"2a7dec90-d501-40e2-9338-df345c0fd672","Type":"ContainerStarted","Data":"fa2d0e952155b79cccb3c17739adc5a4aae8c4031ba7b49fe40da7408bbe7234"} Dec 04 12:16:13 crc kubenswrapper[4760]: I1204 12:16:12.990463 4760 generic.go:334] "Generic (PLEG): container finished" podID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerID="7c34bd6de2857c66edcb3c879077c02de303f2a48a4e08776b7b3fef34b60f93" exitCode=0 Dec 04 12:16:13 crc kubenswrapper[4760]: I1204 12:16:12.992080 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-rpnt8" event={"ID":"6d9a363c-3a21-44a8-aeb0-720692d8ee7f","Type":"ContainerDied","Data":"7c34bd6de2857c66edcb3c879077c02de303f2a48a4e08776b7b3fef34b60f93"} Dec 04 12:16:13 crc kubenswrapper[4760]: I1204 12:16:13.234668 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:16:13 crc kubenswrapper[4760]: I1204 12:16:13.626177 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:13 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:13 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:13 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:13 crc kubenswrapper[4760]: I1204 12:16:13.626346 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:13 crc kubenswrapper[4760]: I1204 12:16:13.725519 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:16:14 crc kubenswrapper[4760]: I1204 12:16:14.036307 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g48rj" event={"ID":"2a7dec90-d501-40e2-9338-df345c0fd672","Type":"ContainerStarted","Data":"2a59bd0c2604ac0429f134badaeaf8846f1835c01facdff11983413229c7dc61"} Dec 04 12:16:14 crc kubenswrapper[4760]: I1204 12:16:14.323826 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lldkh"] Dec 04 12:16:14 crc kubenswrapper[4760]: I1204 12:16:14.623838 4760 patch_prober.go:28] 
interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:14 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:14 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:14 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:14 crc kubenswrapper[4760]: I1204 12:16:14.624458 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.060517 4760 generic.go:334] "Generic (PLEG): container finished" podID="d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003" containerID="64149d3eaca737dc3f1b2bc1bee84593613415cb316ffa2c7c8aec60658726a4" exitCode=0 Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.061010 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003","Type":"ContainerDied","Data":"64149d3eaca737dc3f1b2bc1bee84593613415cb316ffa2c7c8aec60658726a4"} Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.083516 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pt6wf"] Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.093011 4760 generic.go:334] "Generic (PLEG): container finished" podID="2a7dec90-d501-40e2-9338-df345c0fd672" containerID="2a59bd0c2604ac0429f134badaeaf8846f1835c01facdff11983413229c7dc61" exitCode=0 Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.093279 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g48rj" 
event={"ID":"2a7dec90-d501-40e2-9338-df345c0fd672","Type":"ContainerDied","Data":"2a59bd0c2604ac0429f134badaeaf8846f1835c01facdff11983413229c7dc61"} Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.183554 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lldkh" event={"ID":"5a77f5f7-b738-4cba-94ca-06643a4ad964","Type":"ContainerStarted","Data":"df6265cf23803295c0706cfb160372eab67fcf95009963d2f24d81e7fbd62880"} Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.280507 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.282379 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.285997 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.287566 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.372701 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.377668 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.377825 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.451579 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:15 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:15 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:15 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.451694 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.485456 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.485591 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.485769 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.598646 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.619258 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ql8tz"] Dec 04 12:16:15 crc kubenswrapper[4760]: I1204 12:16:15.755676 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.424561 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.424682 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.425245 4760 patch_prober.go:28] interesting pod/console-f9d7485db-5lxbp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 04 12:16:16 crc 
kubenswrapper[4760]: I1204 12:16:16.425339 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5lxbp" podUID="8c791b99-3020-4b20-9d91-87c5ba9f615a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.425647 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.425718 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.435080 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:16 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:16 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:16 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.435168 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.671812 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerID="85d944f1c881f8197da5916e6ec72d5f739ef6c742ca85db3f039de267c54fa8" exitCode=0 Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.672385 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lldkh" event={"ID":"5a77f5f7-b738-4cba-94ca-06643a4ad964","Type":"ContainerDied","Data":"85d944f1c881f8197da5916e6ec72d5f739ef6c742ca85db3f039de267c54fa8"} Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.679373 4760 generic.go:334] "Generic (PLEG): container finished" podID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerID="639ba6555c8b98855087ce53fa165a12f748d0956794344638c4a4db0deb5094" exitCode=0 Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.679463 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt6wf" event={"ID":"8ba63da0-8512-4c36-a755-beaa01a7007b","Type":"ContainerDied","Data":"639ba6555c8b98855087ce53fa165a12f748d0956794344638c4a4db0deb5094"} Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.679505 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt6wf" event={"ID":"8ba63da0-8512-4c36-a755-beaa01a7007b","Type":"ContainerStarted","Data":"213aec7054610f353d35bab5ab3f2799e54de8b4765e3cfaa1aef9fe4d2af680"} Dec 04 12:16:16 crc kubenswrapper[4760]: I1204 12:16:16.681265 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql8tz" event={"ID":"78261f38-564b-4487-a29e-5edc6859825e","Type":"ContainerStarted","Data":"7868838edecec981a84849f76ae53dc84295c6dffbcb0a72cc940089631f2864"} Dec 04 12:16:17 crc kubenswrapper[4760]: I1204 12:16:17.520273 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:17 
crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:17 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:17 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:17 crc kubenswrapper[4760]: I1204 12:16:17.520372 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:18 crc kubenswrapper[4760]: I1204 12:16:18.524651 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:18 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:18 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:18 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:18 crc kubenswrapper[4760]: I1204 12:16:18.526155 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:18 crc kubenswrapper[4760]: I1204 12:16:18.751357 4760 generic.go:334] "Generic (PLEG): container finished" podID="78261f38-564b-4487-a29e-5edc6859825e" containerID="43709f0621c9ab89ea7b79b089e8a59039df9b4ca6f5627c5f24e46aebb4b0e8" exitCode=0 Dec 04 12:16:18 crc kubenswrapper[4760]: I1204 12:16:18.751998 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql8tz" event={"ID":"78261f38-564b-4487-a29e-5edc6859825e","Type":"ContainerDied","Data":"43709f0621c9ab89ea7b79b089e8a59039df9b4ca6f5627c5f24e46aebb4b0e8"} Dec 04 12:16:19 crc kubenswrapper[4760]: I1204 12:16:19.165938 4760 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 12:16:19 crc kubenswrapper[4760]: I1204 12:16:19.453607 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:19 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:19 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:19 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:19 crc kubenswrapper[4760]: I1204 12:16:19.454291 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:19 crc kubenswrapper[4760]: I1204 12:16:19.709732 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 12:16:19 crc kubenswrapper[4760]: I1204 12:16:19.948519 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kubelet-dir\") pod \"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003\" (UID: \"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003\") " Dec 04 12:16:19 crc kubenswrapper[4760]: I1204 12:16:19.948659 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kube-api-access\") pod \"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003\" (UID: \"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003\") " Dec 04 12:16:19 crc kubenswrapper[4760]: I1204 12:16:19.948837 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003" (UID: "d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:16:19 crc kubenswrapper[4760]: I1204 12:16:19.949061 4760 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 12:16:20 crc kubenswrapper[4760]: I1204 12:16:20.058702 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003" (UID: "d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:16:20 crc kubenswrapper[4760]: I1204 12:16:20.076948 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0","Type":"ContainerStarted","Data":"dfc4626f65b02fe89a0f75027b8fac8a72e96222d1f05df63dea479372495c56"} Dec 04 12:16:20 crc kubenswrapper[4760]: I1204 12:16:20.094526 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003","Type":"ContainerDied","Data":"add883f105bc832e8952aec35da1b51a2715e888dbbe36118335fd8ce0a0f649"} Dec 04 12:16:20 crc kubenswrapper[4760]: I1204 12:16:20.094580 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="add883f105bc832e8952aec35da1b51a2715e888dbbe36118335fd8ce0a0f649" Dec 04 12:16:20 crc kubenswrapper[4760]: I1204 12:16:20.094635 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 12:16:20 crc kubenswrapper[4760]: I1204 12:16:20.175813 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 12:16:20 crc kubenswrapper[4760]: I1204 12:16:20.472039 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:20 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:20 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:20 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:20 crc kubenswrapper[4760]: I1204 12:16:20.472864 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:20 crc kubenswrapper[4760]: E1204 12:16:20.578975 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podd1ef2149_e0b7_4a1a_9ffa_d009a3c6e003.slice/crio-add883f105bc832e8952aec35da1b51a2715e888dbbe36118335fd8ce0a0f649\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-podd1ef2149_e0b7_4a1a_9ffa_d009a3c6e003.slice\": RecentStats: unable to find data in memory cache]" Dec 04 12:16:21 crc kubenswrapper[4760]: I1204 12:16:21.138050 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0","Type":"ContainerStarted","Data":"cfe7978e6ff97fa26ca8be5a95a7dcf5ee0d4fe8774b25fc1b4b6ae3a2211e3b"} Dec 04 12:16:21 crc kubenswrapper[4760]: I1204 12:16:21.210008 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=6.20998532 podStartE2EDuration="6.20998532s" podCreationTimestamp="2025-12-04 12:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:16:21.206845777 +0000 UTC m=+184.248292364" watchObservedRunningTime="2025-12-04 12:16:21.20998532 +0000 UTC m=+184.251431907" Dec 04 12:16:21 crc kubenswrapper[4760]: I1204 12:16:21.435079 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:21 crc kubenswrapper[4760]: [-]has-synced failed: reason withheld Dec 04 12:16:21 crc kubenswrapper[4760]: [+]process-running ok Dec 04 12:16:21 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:21 crc kubenswrapper[4760]: I1204 12:16:21.435166 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:22 crc kubenswrapper[4760]: I1204 12:16:22.432519 4760 patch_prober.go:28] interesting pod/router-default-5444994796-42rhb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 12:16:22 crc kubenswrapper[4760]: [+]has-synced ok Dec 04 12:16:22 crc kubenswrapper[4760]: [+]process-running ok Dec 04 
12:16:22 crc kubenswrapper[4760]: healthz check failed Dec 04 12:16:22 crc kubenswrapper[4760]: I1204 12:16:22.432633 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-42rhb" podUID="ca0dbe61-a854-4ab3-90af-4404e679cd68" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:16:23 crc kubenswrapper[4760]: I1204 12:16:23.354315 4760 generic.go:334] "Generic (PLEG): container finished" podID="2aebbe1a-dd84-4a38-8da3-98549fc2d6e0" containerID="cfe7978e6ff97fa26ca8be5a95a7dcf5ee0d4fe8774b25fc1b4b6ae3a2211e3b" exitCode=0 Dec 04 12:16:23 crc kubenswrapper[4760]: I1204 12:16:23.354424 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0","Type":"ContainerDied","Data":"cfe7978e6ff97fa26ca8be5a95a7dcf5ee0d4fe8774b25fc1b4b6ae3a2211e3b"} Dec 04 12:16:23 crc kubenswrapper[4760]: I1204 12:16:23.437716 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:16:23 crc kubenswrapper[4760]: I1204 12:16:23.447497 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-42rhb" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.001079 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.147911 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.153263 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.154676 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.154744 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.155241 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.155263 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.155310 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-console/downloads-7954f5f757-dn2jn" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.156057 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"49475bdc88c035a487264e70f091acd1ad6d14087479f7950da0ec1f9970ea00"} pod="openshift-console/downloads-7954f5f757-dn2jn" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.156167 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" containerID="cri-o://49475bdc88c035a487264e70f091acd1ad6d14087479f7950da0ec1f9970ea00" gracePeriod=2 Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.156785 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.156811 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.185404 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kubelet-dir\") pod \"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0\" (UID: \"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0\") " Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.185494 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kube-api-access\") pod \"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0\" (UID: \"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0\") " Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.186428 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2aebbe1a-dd84-4a38-8da3-98549fc2d6e0" (UID: "2aebbe1a-dd84-4a38-8da3-98549fc2d6e0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.197629 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2aebbe1a-dd84-4a38-8da3-98549fc2d6e0" (UID: "2aebbe1a-dd84-4a38-8da3-98549fc2d6e0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.287792 4760 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.287847 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2aebbe1a-dd84-4a38-8da3-98549fc2d6e0-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.413013 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.415415 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"2aebbe1a-dd84-4a38-8da3-98549fc2d6e0","Type":"ContainerDied","Data":"dfc4626f65b02fe89a0f75027b8fac8a72e96222d1f05df63dea479372495c56"} Dec 04 12:16:26 crc kubenswrapper[4760]: I1204 12:16:26.415463 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc4626f65b02fe89a0f75027b8fac8a72e96222d1f05df63dea479372495c56" Dec 04 12:16:27 crc kubenswrapper[4760]: I1204 12:16:27.433195 4760 generic.go:334] "Generic (PLEG): container finished" podID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerID="49475bdc88c035a487264e70f091acd1ad6d14087479f7950da0ec1f9970ea00" exitCode=0 Dec 04 12:16:27 crc kubenswrapper[4760]: I1204 12:16:27.433489 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dn2jn" event={"ID":"397d3069-2845-40f6-bbb9-d2541d0f3f80","Type":"ContainerDied","Data":"49475bdc88c035a487264e70f091acd1ad6d14087479f7950da0ec1f9970ea00"} Dec 04 12:16:28 crc kubenswrapper[4760]: I1204 12:16:28.448593 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dn2jn" event={"ID":"397d3069-2845-40f6-bbb9-d2541d0f3f80","Type":"ContainerStarted","Data":"4d796614541ee3148d22a5548d422272d88770bb2b3aae44ffc27f5c1f274ba3"} Dec 04 12:16:28 crc kubenswrapper[4760]: I1204 12:16:28.449639 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dn2jn" Dec 04 12:16:28 crc kubenswrapper[4760]: I1204 12:16:28.449942 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection 
refused" start-of-body= Dec 04 12:16:28 crc kubenswrapper[4760]: I1204 12:16:28.449997 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:29 crc kubenswrapper[4760]: I1204 12:16:29.586090 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:29 crc kubenswrapper[4760]: I1204 12:16:29.586140 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:29 crc kubenswrapper[4760]: I1204 12:16:29.688834 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:16:31 crc kubenswrapper[4760]: I1204 12:16:31.613026 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 12:16:33 crc kubenswrapper[4760]: I1204 12:16:33.380558 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:16:33 crc kubenswrapper[4760]: I1204 12:16:33.383682 4760 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:16:36 crc kubenswrapper[4760]: I1204 12:16:36.154606 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:36 crc kubenswrapper[4760]: I1204 12:16:36.154654 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:36 crc kubenswrapper[4760]: I1204 12:16:36.154747 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:36 crc kubenswrapper[4760]: I1204 12:16:36.154760 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:37 crc kubenswrapper[4760]: I1204 12:16:37.485678 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nqzbp" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.671404 4760 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 12:16:45 crc kubenswrapper[4760]: E1204 12:16:45.676035 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003" containerName="pruner" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.676107 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003" containerName="pruner" Dec 04 12:16:45 crc kubenswrapper[4760]: E1204 12:16:45.676134 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aebbe1a-dd84-4a38-8da3-98549fc2d6e0" containerName="pruner" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.676142 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aebbe1a-dd84-4a38-8da3-98549fc2d6e0" containerName="pruner" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.676395 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ef2149-e0b7-4a1a-9ffa-d009a3c6e003" containerName="pruner" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.676427 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aebbe1a-dd84-4a38-8da3-98549fc2d6e0" containerName="pruner" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.677473 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.688300 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.688724 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.692692 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.695969 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.698697 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.790506 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.790615 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.790749 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 12:16:45 crc kubenswrapper[4760]: I1204 12:16:45.821357 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 12:16:46 crc kubenswrapper[4760]: I1204 12:16:46.004612 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 12:16:46 crc kubenswrapper[4760]: I1204 12:16:46.154529 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:46 crc kubenswrapper[4760]: I1204 12:16:46.154605 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:46 crc kubenswrapper[4760]: I1204 12:16:46.154748 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:46 crc kubenswrapper[4760]: I1204 12:16:46.154637 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.272589 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.274451 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.277955 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.449568 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.449657 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e14fed8-554b-47bd-8acd-47076f641fe2-kube-api-access\") pod \"installer-9-crc\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.449682 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-var-lock\") pod \"installer-9-crc\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.551465 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.551563 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8e14fed8-554b-47bd-8acd-47076f641fe2-kube-api-access\") pod \"installer-9-crc\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.551603 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-var-lock\") pod \"installer-9-crc\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.551609 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.551734 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-var-lock\") pod \"installer-9-crc\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.577870 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e14fed8-554b-47bd-8acd-47076f641fe2-kube-api-access\") pod \"installer-9-crc\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:51 crc kubenswrapper[4760]: I1204 12:16:51.630341 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:16:56 crc kubenswrapper[4760]: I1204 12:16:56.209019 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:56 crc kubenswrapper[4760]: I1204 12:16:56.209624 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:56 crc kubenswrapper[4760]: I1204 12:16:56.209177 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:56 crc kubenswrapper[4760]: I1204 12:16:56.209688 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-dn2jn" Dec 04 12:16:56 crc kubenswrapper[4760]: I1204 12:16:56.209744 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:56 crc kubenswrapper[4760]: I1204 12:16:56.210538 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"4d796614541ee3148d22a5548d422272d88770bb2b3aae44ffc27f5c1f274ba3"} pod="openshift-console/downloads-7954f5f757-dn2jn" 
containerMessage="Container download-server failed liveness probe, will be restarted" Dec 04 12:16:56 crc kubenswrapper[4760]: I1204 12:16:56.210583 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" containerID="cri-o://4d796614541ee3148d22a5548d422272d88770bb2b3aae44ffc27f5c1f274ba3" gracePeriod=2 Dec 04 12:16:56 crc kubenswrapper[4760]: I1204 12:16:56.210801 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:16:56 crc kubenswrapper[4760]: I1204 12:16:56.210842 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:16:59 crc kubenswrapper[4760]: I1204 12:16:59.081986 4760 generic.go:334] "Generic (PLEG): container finished" podID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerID="4d796614541ee3148d22a5548d422272d88770bb2b3aae44ffc27f5c1f274ba3" exitCode=0 Dec 04 12:16:59 crc kubenswrapper[4760]: I1204 12:16:59.082117 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dn2jn" event={"ID":"397d3069-2845-40f6-bbb9-d2541d0f3f80","Type":"ContainerDied","Data":"4d796614541ee3148d22a5548d422272d88770bb2b3aae44ffc27f5c1f274ba3"} Dec 04 12:16:59 crc kubenswrapper[4760]: I1204 12:16:59.082425 4760 scope.go:117] "RemoveContainer" containerID="49475bdc88c035a487264e70f091acd1ad6d14087479f7950da0ec1f9970ea00" Dec 04 12:17:03 crc kubenswrapper[4760]: I1204 12:17:03.380904 4760 patch_prober.go:28] 
interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:17:03 crc kubenswrapper[4760]: I1204 12:17:03.381279 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:17:03 crc kubenswrapper[4760]: I1204 12:17:03.381337 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:17:03 crc kubenswrapper[4760]: I1204 12:17:03.383129 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 12:17:03 crc kubenswrapper[4760]: I1204 12:17:03.383192 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683" gracePeriod=600 Dec 04 12:17:06 crc kubenswrapper[4760]: I1204 12:17:06.153694 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection 
refused" start-of-body= Dec 04 12:17:06 crc kubenswrapper[4760]: I1204 12:17:06.153808 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:17:16 crc kubenswrapper[4760]: I1204 12:17:16.153744 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:17:16 crc kubenswrapper[4760]: I1204 12:17:16.154414 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:17:18 crc kubenswrapper[4760]: I1204 12:17:18.367724 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683" exitCode=0 Dec 04 12:17:18 crc kubenswrapper[4760]: I1204 12:17:18.367829 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683"} Dec 04 12:17:18 crc kubenswrapper[4760]: E1204 12:17:18.593792 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 12:17:18 crc 
kubenswrapper[4760]: E1204 12:17:18.594203 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-65zzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-pt6wf_openshift-marketplace(8ba63da0-8512-4c36-a755-beaa01a7007b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 12:17:18 crc kubenswrapper[4760]: E1204 12:17:18.596154 4760 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-pt6wf" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" Dec 04 12:17:24 crc kubenswrapper[4760]: E1204 12:17:24.792145 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-pt6wf" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" Dec 04 12:17:24 crc kubenswrapper[4760]: E1204 12:17:24.864036 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 12:17:24 crc kubenswrapper[4760]: E1204 12:17:24.864295 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sq7j4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ql8tz_openshift-marketplace(78261f38-564b-4487-a29e-5edc6859825e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 12:17:24 crc kubenswrapper[4760]: E1204 12:17:24.865859 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ql8tz" podUID="78261f38-564b-4487-a29e-5edc6859825e" Dec 04 12:17:25 crc 
kubenswrapper[4760]: E1204 12:17:25.625064 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 12:17:25 crc kubenswrapper[4760]: E1204 12:17:25.625294 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h7kvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-lldkh_openshift-marketplace(5a77f5f7-b738-4cba-94ca-06643a4ad964): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 12:17:25 crc kubenswrapper[4760]: E1204 12:17:25.626534 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-lldkh" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" Dec 04 12:17:26 crc kubenswrapper[4760]: I1204 12:17:26.153575 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:17:26 crc kubenswrapper[4760]: I1204 12:17:26.153958 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:17:27 crc kubenswrapper[4760]: E1204 12:17:27.117657 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-lldkh" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" Dec 04 12:17:27 crc kubenswrapper[4760]: E1204 12:17:27.118238 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ql8tz" podUID="78261f38-564b-4487-a29e-5edc6859825e" Dec 04 12:17:27 crc kubenswrapper[4760]: E1204 12:17:27.194644 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 12:17:27 crc kubenswrapper[4760]: E1204 12:17:27.194831 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bhrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fal
lbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-l9t6h_openshift-marketplace(55ebb09b-1c59-4289-92f0-847b3c655fa9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 12:17:27 crc kubenswrapper[4760]: E1204 12:17:27.196062 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-l9t6h" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" Dec 04 12:17:27 crc kubenswrapper[4760]: E1204 12:17:27.577343 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 12:17:27 crc kubenswrapper[4760]: E1204 12:17:27.577496 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tz7mw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bptn9_openshift-marketplace(a3906714-b46d-4640-be9f-d57ba0fd27bb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 12:17:27 crc kubenswrapper[4760]: E1204 12:17:27.578638 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bptn9" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" Dec 04 12:17:27 crc 
kubenswrapper[4760]: E1204 12:17:27.852350 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 12:17:27 crc kubenswrapper[4760]: E1204 12:17:27.852547 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8rnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-g48rj_openshift-marketplace(2a7dec90-d501-40e2-9338-df345c0fd672): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 12:17:27 crc kubenswrapper[4760]: E1204 12:17:27.855051 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-g48rj" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" Dec 04 12:17:28 crc kubenswrapper[4760]: E1204 12:17:28.882114 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-l9t6h" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" Dec 04 12:17:28 crc kubenswrapper[4760]: E1204 12:17:28.882641 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bptn9" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" Dec 04 12:17:28 crc kubenswrapper[4760]: E1204 12:17:28.882790 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-g48rj" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" Dec 04 12:17:29 crc kubenswrapper[4760]: E1204 12:17:29.041372 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image 
from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 12:17:29 crc kubenswrapper[4760]: E1204 12:17:29.041983 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9pv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-94pdb_openshift-marketplace(6ea41f2f-f148-4280-b046-1bea756a117a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: 
context canceled" logger="UnhandledError" Dec 04 12:17:29 crc kubenswrapper[4760]: E1204 12:17:29.043345 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-94pdb" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" Dec 04 12:17:29 crc kubenswrapper[4760]: E1204 12:17:29.051010 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 12:17:29 crc kubenswrapper[4760]: E1204 12:17:29.051143 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8lhnj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rpnt8_openshift-marketplace(6d9a363c-3a21-44a8-aeb0-720692d8ee7f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 12:17:29 crc kubenswrapper[4760]: E1204 12:17:29.052344 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rpnt8" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" Dec 04 12:17:29 crc 
kubenswrapper[4760]: I1204 12:17:29.232141 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 12:17:29 crc kubenswrapper[4760]: W1204 12:17:29.244739 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod89bf2276_6ff6_488b_9c55_b2c9ff8e00c3.slice/crio-465d9b41882633e656e48e784c56c26e71ee4f3bb74ff4eb0b28e5f6065623ad WatchSource:0}: Error finding container 465d9b41882633e656e48e784c56c26e71ee4f3bb74ff4eb0b28e5f6065623ad: Status 404 returned error can't find the container with id 465d9b41882633e656e48e784c56c26e71ee4f3bb74ff4eb0b28e5f6065623ad Dec 04 12:17:29 crc kubenswrapper[4760]: I1204 12:17:29.319129 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 12:17:29 crc kubenswrapper[4760]: W1204 12:17:29.340983 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8e14fed8_554b_47bd_8acd_47076f641fe2.slice/crio-22b04922bf245ad4616063b5295907d88369e35b8875bcd24070d58945ff0ac6 WatchSource:0}: Error finding container 22b04922bf245ad4616063b5295907d88369e35b8875bcd24070d58945ff0ac6: Status 404 returned error can't find the container with id 22b04922bf245ad4616063b5295907d88369e35b8875bcd24070d58945ff0ac6 Dec 04 12:17:29 crc kubenswrapper[4760]: I1204 12:17:29.428240 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3","Type":"ContainerStarted","Data":"465d9b41882633e656e48e784c56c26e71ee4f3bb74ff4eb0b28e5f6065623ad"} Dec 04 12:17:29 crc kubenswrapper[4760]: I1204 12:17:29.432375 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8e14fed8-554b-47bd-8acd-47076f641fe2","Type":"ContainerStarted","Data":"22b04922bf245ad4616063b5295907d88369e35b8875bcd24070d58945ff0ac6"} Dec 04 12:17:29 crc 
kubenswrapper[4760]: E1204 12:17:29.434301 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-94pdb" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" Dec 04 12:17:29 crc kubenswrapper[4760]: E1204 12:17:29.434956 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rpnt8" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" Dec 04 12:17:30 crc kubenswrapper[4760]: I1204 12:17:30.442442 4760 generic.go:334] "Generic (PLEG): container finished" podID="89bf2276-6ff6-488b-9c55-b2c9ff8e00c3" containerID="10d1fa3e4460bdd5478b5f22149849afc834257ca93cd247de9579d5c2b37ac0" exitCode=0 Dec 04 12:17:30 crc kubenswrapper[4760]: I1204 12:17:30.444164 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3","Type":"ContainerDied","Data":"10d1fa3e4460bdd5478b5f22149849afc834257ca93cd247de9579d5c2b37ac0"} Dec 04 12:17:30 crc kubenswrapper[4760]: I1204 12:17:30.453766 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dn2jn" event={"ID":"397d3069-2845-40f6-bbb9-d2541d0f3f80","Type":"ContainerStarted","Data":"1b0486db13c6bf47f3e0811f7010239c4eec12c7bae12554913481a35027ddff"} Dec 04 12:17:30 crc kubenswrapper[4760]: I1204 12:17:30.454844 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dn2jn" Dec 04 12:17:30 crc kubenswrapper[4760]: I1204 12:17:30.454951 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:17:30 crc kubenswrapper[4760]: I1204 12:17:30.455087 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:17:30 crc kubenswrapper[4760]: I1204 12:17:30.457165 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8e14fed8-554b-47bd-8acd-47076f641fe2","Type":"ContainerStarted","Data":"dea1e3ac7a5dbfb6a5a4c8440869f1881b2eaa8d3c60297633a53ee68f0a5aed"} Dec 04 12:17:30 crc kubenswrapper[4760]: I1204 12:17:30.464063 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"98eec381a8f810288ff27a9094a0c5f872e203c92b3c99fd590046c2ebbea2b9"} Dec 04 12:17:30 crc kubenswrapper[4760]: I1204 12:17:30.522046 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=39.522008922 podStartE2EDuration="39.522008922s" podCreationTimestamp="2025-12-04 12:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:17:30.520417551 +0000 UTC m=+253.561864108" watchObservedRunningTime="2025-12-04 12:17:30.522008922 +0000 UTC m=+253.563455499" Dec 04 12:17:31 crc kubenswrapper[4760]: I1204 12:17:31.471186 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:17:31 crc kubenswrapper[4760]: I1204 12:17:31.473515 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:17:31 crc kubenswrapper[4760]: I1204 12:17:31.841623 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 12:17:31 crc kubenswrapper[4760]: I1204 12:17:31.934386 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kubelet-dir\") pod \"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3\" (UID: \"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3\") " Dec 04 12:17:31 crc kubenswrapper[4760]: I1204 12:17:31.934444 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "89bf2276-6ff6-488b-9c55-b2c9ff8e00c3" (UID: "89bf2276-6ff6-488b-9c55-b2c9ff8e00c3"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:17:31 crc kubenswrapper[4760]: I1204 12:17:31.934640 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kube-api-access\") pod \"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3\" (UID: \"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3\") " Dec 04 12:17:31 crc kubenswrapper[4760]: I1204 12:17:31.935048 4760 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 12:17:31 crc kubenswrapper[4760]: I1204 12:17:31.960626 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "89bf2276-6ff6-488b-9c55-b2c9ff8e00c3" (UID: "89bf2276-6ff6-488b-9c55-b2c9ff8e00c3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:17:32 crc kubenswrapper[4760]: I1204 12:17:32.036777 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89bf2276-6ff6-488b-9c55-b2c9ff8e00c3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 12:17:32 crc kubenswrapper[4760]: I1204 12:17:32.481111 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"89bf2276-6ff6-488b-9c55-b2c9ff8e00c3","Type":"ContainerDied","Data":"465d9b41882633e656e48e784c56c26e71ee4f3bb74ff4eb0b28e5f6065623ad"} Dec 04 12:17:32 crc kubenswrapper[4760]: I1204 12:17:32.481505 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="465d9b41882633e656e48e784c56c26e71ee4f3bb74ff4eb0b28e5f6065623ad" Dec 04 12:17:32 crc kubenswrapper[4760]: I1204 12:17:32.481141 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 12:17:32 crc kubenswrapper[4760]: I1204 12:17:32.481849 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:17:32 crc kubenswrapper[4760]: I1204 12:17:32.481907 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:17:36 crc kubenswrapper[4760]: I1204 12:17:36.153128 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:17:36 crc kubenswrapper[4760]: I1204 12:17:36.153362 4760 patch_prober.go:28] interesting pod/downloads-7954f5f757-dn2jn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 04 12:17:36 crc kubenswrapper[4760]: I1204 12:17:36.153744 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:17:36 crc kubenswrapper[4760]: I1204 12:17:36.153809 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dn2jn" podUID="397d3069-2845-40f6-bbb9-d2541d0f3f80" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.41:8080/\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 04 12:17:46 crc kubenswrapper[4760]: I1204 12:17:46.160811 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dn2jn" Dec 04 12:17:46 crc kubenswrapper[4760]: I1204 12:17:46.640765 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bmjcx"] Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.599503 4760 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.600887 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
containerID="cri-o://63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7" gracePeriod=15 Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.601000 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34" gracePeriod=15 Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.601005 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e" gracePeriod=15 Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.601120 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759" gracePeriod=15 Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.601180 4760 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.601186 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597" gracePeriod=15 Dec 04 12:18:08 crc kubenswrapper[4760]: E1204 12:18:07.601848 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.601869 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 12:18:08 crc kubenswrapper[4760]: E1204 12:18:07.601880 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.601886 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 12:18:08 crc kubenswrapper[4760]: E1204 12:18:07.601919 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.601926 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 12:18:08 crc kubenswrapper[4760]: E1204 12:18:07.601937 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.601946 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 12:18:08 crc kubenswrapper[4760]: E1204 12:18:07.601956 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.601969 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 12:18:08 crc kubenswrapper[4760]: E1204 12:18:07.602012 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="89bf2276-6ff6-488b-9c55-b2c9ff8e00c3" containerName="pruner" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602021 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="89bf2276-6ff6-488b-9c55-b2c9ff8e00c3" containerName="pruner" Dec 04 12:18:08 crc kubenswrapper[4760]: E1204 12:18:07.602034 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602040 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 12:18:08 crc kubenswrapper[4760]: E1204 12:18:07.602052 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602060 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602250 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602262 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602268 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602275 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602286 4760 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602323 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="89bf2276-6ff6-488b-9c55-b2c9ff8e00c3" containerName="pruner" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602337 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602350 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 12:18:08 crc kubenswrapper[4760]: E1204 12:18:07.602518 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.602528 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.605195 4760 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.605995 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.616093 4760 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.664715 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.708750 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.708892 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.709036 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.709127 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.709157 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.709324 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.709360 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.709397 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.810989 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811166 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811069 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811247 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811427 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811494 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811462 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811559 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811517 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811606 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811645 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811670 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811733 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811751 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811763 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.811912 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.869651 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:07.953661 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:08.707814 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:08.709491 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:08.710408 4760 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34" exitCode=0 Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:08.710445 4760 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597" exitCode=0 Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:08.710455 4760 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759" 
exitCode=2 Dec 04 12:18:08 crc kubenswrapper[4760]: I1204 12:18:08.710503 4760 scope.go:117] "RemoveContainer" containerID="4a9bf7818272a22be1d45589ad88856bdd989a47830e82ba0724a1f480fac34f" Dec 04 12:18:09 crc kubenswrapper[4760]: I1204 12:18:09.720928 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 12:18:09 crc kubenswrapper[4760]: I1204 12:18:09.722085 4760 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e" exitCode=0 Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.294748 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.297018 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.298795 4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.300043 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.361850 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.361979 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.362008 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.362073 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.362172 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.362272 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.362411 4760 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.362424 4760 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.362437 4760 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:10 crc kubenswrapper[4760]: E1204 12:18:10.396825 4760 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.107:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e025a32a10910 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 12:18:10.395539728 +0000 UTC m=+293.436986295,LastTimestamp:2025-12-04 12:18:10.395539728 +0000 UTC m=+293.436986295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.733588 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0e6fb011ef4a551862cd5c6848b2287ea557a463bf7f06bd7d07bb029238d5d1"} Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.743867 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.757893 4760 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7" exitCode=0 Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.758016 4760 scope.go:117] "RemoveContainer" containerID="160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.758082 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.798568 4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.799151 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.802610 4760 scope.go:117] "RemoveContainer" containerID="e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.842220 4760 scope.go:117] "RemoveContainer" containerID="6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.879591 4760 scope.go:117] "RemoveContainer" containerID="35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.921034 4760 scope.go:117] "RemoveContainer" containerID="63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7" Dec 04 12:18:10 crc kubenswrapper[4760]: I1204 12:18:10.943891 4760 scope.go:117] "RemoveContainer" containerID="cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.995017 4760 scope.go:117] "RemoveContainer" containerID="160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34" Dec 04 12:18:11 crc kubenswrapper[4760]: E1204 
12:18:10.995752 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\": container with ID starting with 160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34 not found: ID does not exist" containerID="160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.995785 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34"} err="failed to get container status \"160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\": rpc error: code = NotFound desc = could not find container \"160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34\": container with ID starting with 160483d5ac7608cde849f7d04581d66e4f06dfd1b073876357c9a3f128f42b34 not found: ID does not exist" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.995812 4760 scope.go:117] "RemoveContainer" containerID="e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597" Dec 04 12:18:11 crc kubenswrapper[4760]: E1204 12:18:10.996634 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\": container with ID starting with e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597 not found: ID does not exist" containerID="e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.996658 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597"} err="failed to get container status \"e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\": rpc 
error: code = NotFound desc = could not find container \"e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597\": container with ID starting with e5063cacaa647e6ea6d6acd9284548d5ce25209a1549e670cbceefabd1caa597 not found: ID does not exist" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.996679 4760 scope.go:117] "RemoveContainer" containerID="6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e" Dec 04 12:18:11 crc kubenswrapper[4760]: E1204 12:18:10.997109 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\": container with ID starting with 6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e not found: ID does not exist" containerID="6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.997136 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e"} err="failed to get container status \"6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\": rpc error: code = NotFound desc = could not find container \"6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e\": container with ID starting with 6e259a6c2d0f0a25b04ecbf51598140cbe3fcc651bec1a943f333aa9a17fd76e not found: ID does not exist" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.997155 4760 scope.go:117] "RemoveContainer" containerID="35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759" Dec 04 12:18:11 crc kubenswrapper[4760]: E1204 12:18:10.997503 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\": container with ID starting with 
35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759 not found: ID does not exist" containerID="35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.997525 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759"} err="failed to get container status \"35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\": rpc error: code = NotFound desc = could not find container \"35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759\": container with ID starting with 35eed222a31cd8346bfb6f6de32c1df7ae5b77f42610efbea6c6a9f95c051759 not found: ID does not exist" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.997542 4760 scope.go:117] "RemoveContainer" containerID="63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7" Dec 04 12:18:11 crc kubenswrapper[4760]: E1204 12:18:10.997868 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\": container with ID starting with 63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7 not found: ID does not exist" containerID="63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.997902 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7"} err="failed to get container status \"63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\": rpc error: code = NotFound desc = could not find container \"63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7\": container with ID starting with 63f503eabfc675fabad251c6cf1dc58b8fb9118bd7d7746a417a0aaf7134daa7 not found: ID does not 
exist" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.997924 4760 scope.go:117] "RemoveContainer" containerID="cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5" Dec 04 12:18:11 crc kubenswrapper[4760]: E1204 12:18:10.998301 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\": container with ID starting with cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5 not found: ID does not exist" containerID="cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:10.998321 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5"} err="failed to get container status \"cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\": rpc error: code = NotFound desc = could not find container \"cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5\": container with ID starting with cf45535911210b45f6ec8062d5d541ac0c546b27944f22c7ae4204570afe5da5 not found: ID does not exist" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.669761 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" containerName="oauth-openshift" containerID="cri-o://701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820" gracePeriod=15 Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.774064 4760 generic.go:334] "Generic (PLEG): container finished" podID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerID="af43d776df0a39cf84907aad72f960bd96624ca3b91071e1448c373b108987c8" exitCode=0 Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.774179 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-l9t6h" event={"ID":"55ebb09b-1c59-4289-92f0-847b3c655fa9","Type":"ContainerDied","Data":"af43d776df0a39cf84907aad72f960bd96624ca3b91071e1448c373b108987c8"} Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.775703 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.776156 4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.776816 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.777698 4760 generic.go:334] "Generic (PLEG): container finished" podID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerID="df4492b64a044a2dee0084a14130bd5a4054f7e69e2bd70b5d62c2e90b565eb8" exitCode=0 Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.777734 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bptn9" event={"ID":"a3906714-b46d-4640-be9f-d57ba0fd27bb","Type":"ContainerDied","Data":"df4492b64a044a2dee0084a14130bd5a4054f7e69e2bd70b5d62c2e90b565eb8"} Dec 04 12:18:11 
crc kubenswrapper[4760]: I1204 12:18:11.778813 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.779347 4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.779745 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.780232 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.782395 4760 generic.go:334] "Generic (PLEG): container finished" podID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerID="68efd119a41a62a8cbf1f91b611b6691bf1d6d892aaf96f066be8d3861b6eadc" exitCode=0 Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.782470 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-pt6wf" event={"ID":"8ba63da0-8512-4c36-a755-beaa01a7007b","Type":"ContainerDied","Data":"68efd119a41a62a8cbf1f91b611b6691bf1d6d892aaf96f066be8d3861b6eadc"} Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.783572 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.784001 4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.787258 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.788182 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.788752 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" 
pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.791271 4760 generic.go:334] "Generic (PLEG): container finished" podID="6ea41f2f-f148-4280-b046-1bea756a117a" containerID="ec6127f8fb88f77dc615526d2897a00457f626bf7cd8ddf707990c1f09207f0f" exitCode=0 Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.791367 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pdb" event={"ID":"6ea41f2f-f148-4280-b046-1bea756a117a","Type":"ContainerDied","Data":"ec6127f8fb88f77dc615526d2897a00457f626bf7cd8ddf707990c1f09207f0f"} Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.792270 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.792798 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.793569 4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 
crc kubenswrapper[4760]: I1204 12:18:11.794164 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.794613 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.795079 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.799725 4760 generic.go:334] "Generic (PLEG): container finished" podID="78261f38-564b-4487-a29e-5edc6859825e" containerID="19b991656209bb1d3082c04e42be7b8a67a6db8c0e9ec39c04a018804db3bfa9" exitCode=0 Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.799829 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql8tz" event={"ID":"78261f38-564b-4487-a29e-5edc6859825e","Type":"ContainerDied","Data":"19b991656209bb1d3082c04e42be7b8a67a6db8c0e9ec39c04a018804db3bfa9"} Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.801802 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.803514 4760 generic.go:334] "Generic (PLEG): container finished" podID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerID="66ddaf89718a29f54023d398b58ca5628218dfded3deb61e3d36ac0ae15e41de" exitCode=0 Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.803657 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lldkh" event={"ID":"5a77f5f7-b738-4cba-94ca-06643a4ad964","Type":"ContainerDied","Data":"66ddaf89718a29f54023d398b58ca5628218dfded3deb61e3d36ac0ae15e41de"} Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.802661 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.804726 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.805292 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.806127 
4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.806893 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.807386 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.807526 4760 generic.go:334] "Generic (PLEG): container finished" podID="2a7dec90-d501-40e2-9338-df345c0fd672" containerID="4beb0e3a73ad8f1fbbd17dc1589d6d7800cc9dc0a01e092755f743661b1b373a" exitCode=0 Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.807608 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g48rj" event={"ID":"2a7dec90-d501-40e2-9338-df345c0fd672","Type":"ContainerDied","Data":"4beb0e3a73ad8f1fbbd17dc1589d6d7800cc9dc0a01e092755f743661b1b373a"} Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.807878 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.808090 4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.808337 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.808635 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.809186 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.809635 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.809847 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.810646 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.812231 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.813182 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.813584 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.813824 4760 generic.go:334] "Generic (PLEG): container finished" podID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerID="693d2d713f8c1b4d0c2e6dae38ca865ee958799a3f86d0e8ed2fc1a8b6ddd341" exitCode=0 Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.813868 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.813915 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpnt8" event={"ID":"6d9a363c-3a21-44a8-aeb0-720692d8ee7f","Type":"ContainerDied","Data":"693d2d713f8c1b4d0c2e6dae38ca865ee958799a3f86d0e8ed2fc1a8b6ddd341"} Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.814239 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.814717 4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.815482 4760 
status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.815748 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.816074 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.816742 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f9e315a94ae24f035145083f83669cda0bc2a4b28b24cc60ccdfa621ad94119e"} Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.816844 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.817343 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.817807 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.818082 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.818370 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.818589 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.818898 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.819261 4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.820524 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.821074 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.821654 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.822034 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.822333 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.822568 4760 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.822908 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.823325 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.823662 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.825166 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.825477 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.825887 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:11 crc kubenswrapper[4760]: I1204 12:18:11.872600 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.172903 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.173852 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.174501 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.175200 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.175736 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.176157 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.176888 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.177276 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.177652 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.177996 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.178376 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296299 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-ocp-branding-template\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296375 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-session\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296406 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-serving-cert\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296444 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-router-certs\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296495 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-error\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296520 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-dir\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296555 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpscn\" (UniqueName: \"kubernetes.io/projected/87218323-b321-4cd8-8da1-5fa8769eb3b0-kube-api-access-bpscn\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296582 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-policies\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296656 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-cliconfig\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296684 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-login\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: 
\"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296724 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-idp-0-file-data\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296753 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-service-ca\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296793 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-provider-selection\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296888 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-trusted-ca-bundle\") pod \"87218323-b321-4cd8-8da1-5fa8769eb3b0\" (UID: \"87218323-b321-4cd8-8da1-5fa8769eb3b0\") " Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.296902 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.297248 4760 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.297857 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.299459 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.299677 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.300689 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.309278 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.310730 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87218323-b321-4cd8-8da1-5fa8769eb3b0-kube-api-access-bpscn" (OuterVolumeSpecName: "kube-api-access-bpscn") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "kube-api-access-bpscn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.310743 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.311800 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.313325 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.314628 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.315474 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.315974 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.316437 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "87218323-b321-4cd8-8da1-5fa8769eb3b0" (UID: "87218323-b321-4cd8-8da1-5fa8769eb3b0"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398087 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398165 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpscn\" (UniqueName: \"kubernetes.io/projected/87218323-b321-4cd8-8da1-5fa8769eb3b0-kube-api-access-bpscn\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398182 4760 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398198 4760 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398227 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398242 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398256 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398270 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398285 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398298 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-ocp-branding-template\") on 
node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398313 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398325 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.398338 4760 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87218323-b321-4cd8-8da1-5fa8769eb3b0-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.829829 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9t6h" event={"ID":"55ebb09b-1c59-4289-92f0-847b3c655fa9","Type":"ContainerStarted","Data":"39f80c3ebd54a9e7aa5bb05c5021175eded1ad09a82b361e7b5d5e123f92c125"} Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.830994 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.831456 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 
38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.831910 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.832282 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.832674 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.833007 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.833094 4760 generic.go:334] "Generic (PLEG): container finished" podID="87218323-b321-4cd8-8da1-5fa8769eb3b0" containerID="701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820" exitCode=0 Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.833155 
4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" event={"ID":"87218323-b321-4cd8-8da1-5fa8769eb3b0","Type":"ContainerDied","Data":"701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820"} Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.833182 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" event={"ID":"87218323-b321-4cd8-8da1-5fa8769eb3b0","Type":"ContainerDied","Data":"06b3fa21c4a50b028ef1278937efcebe92e6883edd59d72cb1df2bf83343d720"} Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.833229 4760 scope.go:117] "RemoveContainer" containerID="701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.833275 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.833391 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.833549 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.833846 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.834393 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.834776 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.835166 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.836813 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.837048 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.837383 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.837748 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.838443 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" 
pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.843872 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.844199 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.844634 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.852309 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bptn9" event={"ID":"a3906714-b46d-4640-be9f-d57ba0fd27bb","Type":"ContainerStarted","Data":"119235e10def2886fe1fde6c3c4a419a48a4ba6457aa9becb2104eced2eedacc"} Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.853261 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.853475 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.853924 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.854378 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.854664 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.855137 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.856065 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.856732 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.857311 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.857501 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.857969 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pdb" 
event={"ID":"6ea41f2f-f148-4280-b046-1bea756a117a","Type":"ContainerStarted","Data":"770340b81b8a1d50e739eaa57e985bd69f7933784df631cb0d8858e5dc4af17e"} Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.859995 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.860306 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.860523 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.860766 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.860967 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" 
pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.861185 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.861461 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.861672 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.861874 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.862078 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" 
pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.872446 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql8tz" event={"ID":"78261f38-564b-4487-a29e-5edc6859825e","Type":"ContainerStarted","Data":"4786b09a38b4f96970bb6e7f9f4a6e1d6d38032bff2f27c387bde3b30ec92ac0"} Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.874933 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.876504 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.876890 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.880300 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.881020 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.881604 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.881940 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.882635 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.883163 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.883857 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.894258 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.894971 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.896075 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.898264 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" 
pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.899157 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.900686 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.902312 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.902968 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.906064 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" 
pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.906873 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.927866 4760 scope.go:117] "RemoveContainer" containerID="701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820" Dec 04 12:18:12 crc kubenswrapper[4760]: E1204 12:18:12.928500 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820\": container with ID starting with 701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820 not found: ID does not exist" containerID="701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820" Dec 04 12:18:12 crc kubenswrapper[4760]: I1204 12:18:12.928528 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820"} err="failed to get container status \"701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820\": rpc error: code = NotFound desc = could not find container \"701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820\": container with ID starting with 701bb7a91bde2ecdbd1d87b1040370b32dd1ed5df79a7973dcf68787f46f6820 not found: ID does not exist" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.236517 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.236569 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:18:13 crc kubenswrapper[4760]: E1204 12:18:13.606138 4760 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.107:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e025a32a10910 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 12:18:10.395539728 +0000 UTC m=+293.436986295,LastTimestamp:2025-12-04 12:18:10.395539728 +0000 UTC m=+293.436986295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.881771 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lldkh" event={"ID":"5a77f5f7-b738-4cba-94ca-06643a4ad964","Type":"ContainerStarted","Data":"99eaf56a85ea61a2e74d07b1d509e294eacc24a2a6765c773d534ec74aa0a9f9"} Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.883271 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.883696 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.884172 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.884321 4760 generic.go:334] "Generic (PLEG): container finished" podID="8e14fed8-554b-47bd-8acd-47076f641fe2" containerID="dea1e3ac7a5dbfb6a5a4c8440869f1881b2eaa8d3c60297633a53ee68f0a5aed" exitCode=0 Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.884412 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8e14fed8-554b-47bd-8acd-47076f641fe2","Type":"ContainerDied","Data":"dea1e3ac7a5dbfb6a5a4c8440869f1881b2eaa8d3c60297633a53ee68f0a5aed"} Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.884794 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 
12:18:13.885104 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.886144 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.886586 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.886933 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.887194 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 
12:18:13.887503 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.888364 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.888690 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g48rj" event={"ID":"2a7dec90-d501-40e2-9338-df345c0fd672","Type":"ContainerStarted","Data":"ce282d39b5518acbb22e2eb04d59aaae0755a087e85d28c3da1445d16f7c6927"} Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.888886 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.889167 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.889485 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.890143 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.890449 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.890670 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.890898 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.892880 4760 status_manager.go:851] "Failed to get 
status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.893317 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpnt8" event={"ID":"6d9a363c-3a21-44a8-aeb0-720692d8ee7f","Type":"ContainerStarted","Data":"b7f999cf403f1b687eedc308b2a99c22bea2470a7b7979a8c448192ba607bcea"} Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.893814 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.894696 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.895666 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.895975 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" 
pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.896307 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.897153 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.897493 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.899008 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.899601 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" 
pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.900050 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.900420 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.900960 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.901718 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.905536 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt6wf" 
event={"ID":"8ba63da0-8512-4c36-a755-beaa01a7007b","Type":"ContainerStarted","Data":"f10d201a5a66cefb4da0e3b8576648f2f82b084861d1cb345ef6f4c4cd1d3ac1"} Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.906571 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.907244 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.907707 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.907941 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.908136 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.908365 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.908575 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.908768 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.908974 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.909176 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:13 crc kubenswrapper[4760]: I1204 12:18:13.909393 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:14 crc kubenswrapper[4760]: I1204 12:18:14.433923 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ql8tz" podUID="78261f38-564b-4487-a29e-5edc6859825e" containerName="registry-server" probeResult="failure" output=< Dec 04 12:18:14 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Dec 04 12:18:14 crc kubenswrapper[4760]: > Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.259858 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.261129 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.262755 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.264524 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.265442 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.265770 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial 
tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.265976 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.266166 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.266377 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.266912 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.267328 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 
38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.267570 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.342929 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-var-lock\") pod \"8e14fed8-554b-47bd-8acd-47076f641fe2\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.343178 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e14fed8-554b-47bd-8acd-47076f641fe2-kube-api-access\") pod \"8e14fed8-554b-47bd-8acd-47076f641fe2\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.343281 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-kubelet-dir\") pod \"8e14fed8-554b-47bd-8acd-47076f641fe2\" (UID: \"8e14fed8-554b-47bd-8acd-47076f641fe2\") " Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.343821 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8e14fed8-554b-47bd-8acd-47076f641fe2" (UID: "8e14fed8-554b-47bd-8acd-47076f641fe2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.343888 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-var-lock" (OuterVolumeSpecName: "var-lock") pod "8e14fed8-554b-47bd-8acd-47076f641fe2" (UID: "8e14fed8-554b-47bd-8acd-47076f641fe2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.351687 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e14fed8-554b-47bd-8acd-47076f641fe2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8e14fed8-554b-47bd-8acd-47076f641fe2" (UID: "8e14fed8-554b-47bd-8acd-47076f641fe2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.444548 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e14fed8-554b-47bd-8acd-47076f641fe2-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.444596 4760 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.444609 4760 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e14fed8-554b-47bd-8acd-47076f641fe2-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 12:18:15 crc kubenswrapper[4760]: E1204 12:18:15.709431 4760 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection 
refused" Dec 04 12:18:15 crc kubenswrapper[4760]: E1204 12:18:15.709993 4760 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: E1204 12:18:15.710237 4760 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: E1204 12:18:15.710458 4760 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: E1204 12:18:15.710706 4760 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.710736 4760 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 04 12:18:15 crc kubenswrapper[4760]: E1204 12:18:15.710958 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="200ms" Dec 04 12:18:15 crc kubenswrapper[4760]: E1204 12:18:15.913114 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.107:6443: connect: connection refused" interval="400ms" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.921937 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8e14fed8-554b-47bd-8acd-47076f641fe2","Type":"ContainerDied","Data":"22b04922bf245ad4616063b5295907d88369e35b8875bcd24070d58945ff0ac6"} Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.921992 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22b04922bf245ad4616063b5295907d88369e35b8875bcd24070d58945ff0ac6" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.922138 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.928966 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.930124 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.931175 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc 
kubenswrapper[4760]: I1204 12:18:15.931598 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.934255 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.934873 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.935085 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.935271 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection 
refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.935438 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.935851 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:15 crc kubenswrapper[4760]: I1204 12:18:15.936413 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:16 crc kubenswrapper[4760]: E1204 12:18:16.314854 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="800ms" Dec 04 12:18:16 crc kubenswrapper[4760]: E1204 12:18:16.580694 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:18:16Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:18:16Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:18:16Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:18:16Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certifi
ed-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df3
12ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a
3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"]
,\\\"sizeBytes\\\":457588564}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:16 crc kubenswrapper[4760]: E1204 12:18:16.582394 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:16 crc kubenswrapper[4760]: E1204 12:18:16.582905 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:16 crc kubenswrapper[4760]: E1204 12:18:16.583572 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:16 crc kubenswrapper[4760]: E1204 12:18:16.584424 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:16 crc kubenswrapper[4760]: E1204 12:18:16.584466 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:18:17 crc kubenswrapper[4760]: E1204 12:18:17.115691 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="1.6s" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.727875 4760 cert_rotation.go:91] certificate rotation 
detected, shutting down client connections to start using new credentials Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.867171 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.867547 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.868814 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.869161 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.869531 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.869759 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.870050 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.870424 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.871408 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.872187 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:17 crc kubenswrapper[4760]: I1204 12:18:17.872883 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:18 crc kubenswrapper[4760]: E1204 12:18:18.716603 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="3.2s" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.308471 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.308830 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.365198 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.365926 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.366496 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.367060 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.367358 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.367926 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.368962 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.369199 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.369994 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.370272 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.370709 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.371291 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.494785 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-bptn9" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.494832 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bptn9" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.534916 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bptn9" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.535593 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.536034 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.536398 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.536781 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: 
connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.537220 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.537425 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.537627 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.537872 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.538204 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": 
dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.538417 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.538594 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.755995 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rpnt8" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.756064 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rpnt8" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.796559 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rpnt8" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.797338 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.797689 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.798150 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.798460 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.798739 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.799038 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.799355 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.799629 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.799955 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.800255 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.800551 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.856426 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-l9t6h" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.856536 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l9t6h" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.898476 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l9t6h" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.899587 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.900039 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.900228 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.900576 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection 
refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.900942 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.901268 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.901817 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.902362 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.902750 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 
38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.903128 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.903511 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.996927 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rpnt8" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.997017 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l9t6h" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.997057 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bptn9" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.998031 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.998490 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.998767 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.999051 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.999085 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.999355 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:19 crc kubenswrapper[4760]: I1204 12:18:19.999708 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 
38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:19.999973 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.000225 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.001373 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.001637 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.001995 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 
38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.002506 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.002743 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.003173 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.003498 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.003725 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 
38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.004068 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.004386 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.004676 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.004945 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.005515 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: 
connection refused" Dec 04 12:18:20 crc kubenswrapper[4760]: I1204 12:18:20.006248 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.687669 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g48rj" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.688094 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g48rj" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.734478 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g48rj" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.735503 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.736450 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.737112 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" 
pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.737527 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.737923 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.738305 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.738666 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.739056 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.739444 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.739811 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.740397 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.863575 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.864878 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.865588 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.866241 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.866625 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.867081 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.867533 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.868562 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.869232 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.869614 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.870172 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.870651 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.882640 4760 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5499d924-81bc-4bd7-8148-1fd816851d20" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.882683 4760 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5499d924-81bc-4bd7-8148-1fd816851d20" Dec 04 12:18:21 crc kubenswrapper[4760]: E1204 12:18:21.883201 4760 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.883777 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:21 crc kubenswrapper[4760]: E1204 12:18:21.918131 4760 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="6.4s" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.942615 4760 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.942704 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.965950 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.966030 4760 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64" exitCode=1 Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.966194 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64"} Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.967628 4760 scope.go:117] "RemoveContainer" containerID="cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.968043 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.968601 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1ef8a8e923df5dd9593f99322d856223350364b9bc425100f20d526c577d050e"} Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.968745 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.969466 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.970002 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.970784 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.971834 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.972523 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.972699 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.972858 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.973050 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.973206 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.973378 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.973652 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:18:21 crc kubenswrapper[4760]: I1204 12:18:21.973682 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.023956 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.024810 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.025558 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.026171 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.026459 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.026726 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.027105 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.027404 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g48rj" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.027482 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.027842 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.028300 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.028908 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.029246 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.029667 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.030287 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.030703 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.031072 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.031637 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.032000 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.032459 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.032897 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.033302 4760 status_manager.go:851] "Failed to get 
status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.034366 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.034802 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.035116 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.035834 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.311387 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.311765 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.351051 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.351853 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.352314 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.352631 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.353043 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: 
connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.353527 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.353883 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.354166 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.354736 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.355548 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.355856 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.356168 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:22 crc kubenswrapper[4760]: I1204 12:18:22.356507 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.022942 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.024085 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 
12:18:23.024460 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.024742 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.024987 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.025250 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.025514 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc 
kubenswrapper[4760]: I1204 12:18:23.025768 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.026028 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.026347 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.026585 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.026633 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.026882 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.027193 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.027594 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.027829 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.028051 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.028390 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.028594 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.028766 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.028965 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.029222 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.029477 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" 
pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.029997 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.030221 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.030452 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.281259 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.282395 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection 
refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.283444 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.284144 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.285002 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.285488 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.285884 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc 
kubenswrapper[4760]: I1204 12:18:23.288686 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.289478 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.290686 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.291639 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.292118 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: 
connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.292627 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.323833 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.324591 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.325106 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.325367 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.325660 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.325979 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.326205 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.326454 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.326689 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.326960 4760 status_manager.go:851] "Failed to get status 
for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.327325 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.327574 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: I1204 12:18:23.327855 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:23 crc kubenswrapper[4760]: E1204 12:18:23.607830 4760 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.107:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e025a32a10910 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 12:18:10.395539728 +0000 UTC m=+293.436986295,LastTimestamp:2025-12-04 12:18:10.395539728 +0000 UTC m=+293.436986295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 12:18:25 crc kubenswrapper[4760]: I1204 12:18:25.131131 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:18:26 crc kubenswrapper[4760]: I1204 12:18:26.002502 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fc7b0f1d18f451f0ea6782751575ac6936347eee6ebf84e9d4ee79229ed66536"} Dec 04 12:18:26 crc kubenswrapper[4760]: I1204 12:18:26.006319 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 12:18:26 crc kubenswrapper[4760]: I1204 12:18:26.006388 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5df0a5a1166a7b104a1b63808cf584bbfaf8f0f03f35b2d6ad3bbcd8dc811ee8"} Dec 04 12:18:26 crc kubenswrapper[4760]: E1204 12:18:26.826404 4760 kubelet_node_status.go:585] "Error updating node 
status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:18:26Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:18:26Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:18:26Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T12:18:26Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[],\\\"sizeBytes\\\":1610175307},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[],\\\"sizeBytes\\\":1129027903},{\\\"
names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"s
izeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:26 crc kubenswrapper[4760]: E1204 12:18:26.827506 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:26 crc kubenswrapper[4760]: E1204 12:18:26.828049 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:26 crc kubenswrapper[4760]: E1204 12:18:26.828610 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:26 crc kubenswrapper[4760]: E1204 12:18:26.829093 4760 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:26 crc kubenswrapper[4760]: E1204 12:18:26.829128 4760 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.014808 4760 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="fc7b0f1d18f451f0ea6782751575ac6936347eee6ebf84e9d4ee79229ed66536" exitCode=0 Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.014935 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"fc7b0f1d18f451f0ea6782751575ac6936347eee6ebf84e9d4ee79229ed66536"} Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.015710 4760 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5499d924-81bc-4bd7-8148-1fd816851d20" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.015732 4760 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5499d924-81bc-4bd7-8148-1fd816851d20" Dec 04 12:18:27 crc kubenswrapper[4760]: E1204 12:18:27.015999 4760 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.015984 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.016490 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.016896 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.017562 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.017845 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.018105 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.018472 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.018847 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.019304 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.019664 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.019957 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.020307 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.020787 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" 
pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.021074 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.021405 4760 status_manager.go:851] "Failed to get status for pod" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.021633 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.021847 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.022018 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.022170 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.022383 4760 status_manager.go:851] "Failed to get status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.022621 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.022859 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.023272 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.023535 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.870355 4760 status_manager.go:851] "Failed to get status for pod" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" pod="openshift-marketplace/community-operators-rpnt8" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-rpnt8\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.871259 4760 status_manager.go:851] "Failed to get status for pod" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" pod="openshift-marketplace/certified-operators-l9t6h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-l9t6h\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.871612 4760 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.872067 4760 status_manager.go:851] "Failed to get status for pod" 
podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" pod="openshift-marketplace/redhat-operators-pt6wf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt6wf\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.872366 4760 status_manager.go:851] "Failed to get status for pod" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.872623 4760 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.873018 4760 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.873370 4760 status_manager.go:851] "Failed to get status for pod" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" pod="openshift-marketplace/certified-operators-bptn9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-bptn9\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.873633 4760 status_manager.go:851] "Failed to get 
status for pod" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" pod="openshift-authentication/oauth-openshift-558db77b4-bmjcx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-bmjcx\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.873876 4760 status_manager.go:851] "Failed to get status for pod" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" pod="openshift-marketplace/redhat-marketplace-g48rj" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-g48rj\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.874501 4760 status_manager.go:851] "Failed to get status for pod" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" pod="openshift-marketplace/community-operators-94pdb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-94pdb\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.874759 4760 status_manager.go:851] "Failed to get status for pod" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" pod="openshift-marketplace/redhat-marketplace-lldkh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-lldkh\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:27 crc kubenswrapper[4760]: I1204 12:18:27.875038 4760 status_manager.go:851] "Failed to get status for pod" podUID="78261f38-564b-4487-a29e-5edc6859825e" pod="openshift-marketplace/redhat-operators-ql8tz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ql8tz\": dial tcp 38.102.83.107:6443: connect: connection refused" Dec 04 12:18:28 crc kubenswrapper[4760]: E1204 12:18:28.319272 4760 controller.go:145] "Failed to ensure 
lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="7s" Dec 04 12:18:30 crc kubenswrapper[4760]: I1204 12:18:30.041322 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"25dc2813b6fd52f8164fdac44f77d266f33d3d8fa5bca7784b397001e4f78e1c"} Dec 04 12:18:31 crc kubenswrapper[4760]: I1204 12:18:31.059545 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"492b2b4924ec36c0f0677ddedcc54482d9101f022b36ffa4b8dacc45c63b4533"} Dec 04 12:18:31 crc kubenswrapper[4760]: I1204 12:18:31.059812 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a39ad2fc83d373f15469b5710b27fb83e70d7b39d76a3afb1096f90e98191b85"} Dec 04 12:18:31 crc kubenswrapper[4760]: I1204 12:18:31.940579 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:18:32 crc kubenswrapper[4760]: I1204 12:18:32.140199 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7869668c72dfbcc7e9d8df397f0851e697e46de801e2be326eeaea39cb8839b3"} Dec 04 12:18:32 crc kubenswrapper[4760]: I1204 12:18:32.140278 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"60d57aba643c5da7e4c51855b3d4560d15a6b615637493d0831270dc44d1f977"} Dec 04 12:18:32 crc kubenswrapper[4760]: I1204 12:18:32.140586 4760 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5499d924-81bc-4bd7-8148-1fd816851d20" Dec 04 12:18:32 crc kubenswrapper[4760]: I1204 12:18:32.140602 4760 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5499d924-81bc-4bd7-8148-1fd816851d20" Dec 04 12:18:32 crc kubenswrapper[4760]: I1204 12:18:32.140771 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:35 crc kubenswrapper[4760]: I1204 12:18:35.131299 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:18:35 crc kubenswrapper[4760]: I1204 12:18:35.131459 4760 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 04 12:18:35 crc kubenswrapper[4760]: I1204 12:18:35.131844 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 04 12:18:36 crc kubenswrapper[4760]: I1204 12:18:36.884454 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:36 crc kubenswrapper[4760]: I1204 12:18:36.884538 4760 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:36 crc kubenswrapper[4760]: I1204 12:18:36.891328 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:37 crc kubenswrapper[4760]: I1204 12:18:37.155103 4760 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:18:37 crc kubenswrapper[4760]: I1204 12:18:37.196368 4760 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eed78284-2990-4cfe-ad98-2d96acf07afb" Dec 04 12:18:37 crc kubenswrapper[4760]: I1204 12:18:37.241883 4760 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5499d924-81bc-4bd7-8148-1fd816851d20" Dec 04 12:18:37 crc kubenswrapper[4760]: I1204 12:18:37.241927 4760 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5499d924-81bc-4bd7-8148-1fd816851d20" Dec 04 12:18:37 crc kubenswrapper[4760]: I1204 12:18:37.436102 4760 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eed78284-2990-4cfe-ad98-2d96acf07afb" Dec 04 12:18:45 crc kubenswrapper[4760]: I1204 12:18:45.131680 4760 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 04 12:18:45 crc kubenswrapper[4760]: I1204 12:18:45.132282 4760 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 04 12:18:49 crc kubenswrapper[4760]: I1204 12:18:49.735705 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 12:18:49 crc kubenswrapper[4760]: I1204 12:18:49.971286 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 12:18:50 crc kubenswrapper[4760]: I1204 12:18:50.736563 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 12:18:50 crc kubenswrapper[4760]: I1204 12:18:50.924159 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 12:18:51 crc kubenswrapper[4760]: I1204 12:18:51.079861 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 12:18:51 crc kubenswrapper[4760]: I1204 12:18:51.760082 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 12:18:51 crc kubenswrapper[4760]: I1204 12:18:51.850811 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 12:18:52 crc kubenswrapper[4760]: I1204 12:18:52.491159 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 12:18:52 crc kubenswrapper[4760]: I1204 12:18:52.822706 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 12:18:52 crc 
kubenswrapper[4760]: I1204 12:18:52.897320 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 12:18:53 crc kubenswrapper[4760]: I1204 12:18:53.064882 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 12:18:53 crc kubenswrapper[4760]: I1204 12:18:53.103575 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 12:18:53 crc kubenswrapper[4760]: I1204 12:18:53.123874 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 12:18:53 crc kubenswrapper[4760]: I1204 12:18:53.388300 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 12:18:53 crc kubenswrapper[4760]: I1204 12:18:53.637236 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 12:18:54 crc kubenswrapper[4760]: I1204 12:18:54.145586 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 12:18:54 crc kubenswrapper[4760]: I1204 12:18:54.576949 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 12:18:54 crc kubenswrapper[4760]: I1204 12:18:54.995668 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 12:18:55 crc kubenswrapper[4760]: I1204 12:18:55.041466 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 12:18:55 crc kubenswrapper[4760]: I1204 12:18:55.132582 4760 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 04 12:18:55 crc kubenswrapper[4760]: I1204 12:18:55.132673 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 04 12:18:55 crc kubenswrapper[4760]: I1204 12:18:55.132758 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:18:55 crc kubenswrapper[4760]: I1204 12:18:55.133646 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"5df0a5a1166a7b104a1b63808cf584bbfaf8f0f03f35b2d6ad3bbcd8dc811ee8"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 04 12:18:55 crc kubenswrapper[4760]: I1204 12:18:55.133816 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://5df0a5a1166a7b104a1b63808cf584bbfaf8f0f03f35b2d6ad3bbcd8dc811ee8" gracePeriod=30 Dec 04 12:18:55 crc kubenswrapper[4760]: I1204 12:18:55.407113 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 12:18:55 crc kubenswrapper[4760]: I1204 12:18:55.564076 4760 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 12:18:56 crc kubenswrapper[4760]: I1204 12:18:56.301734 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 12:18:56 crc kubenswrapper[4760]: I1204 12:18:56.585856 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 12:18:56 crc kubenswrapper[4760]: I1204 12:18:56.586911 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 12:18:56 crc kubenswrapper[4760]: I1204 12:18:56.601633 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 12:18:56 crc kubenswrapper[4760]: I1204 12:18:56.725853 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 12:18:56 crc kubenswrapper[4760]: I1204 12:18:56.926788 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 04 12:18:57 crc kubenswrapper[4760]: I1204 12:18:57.236620 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 12:18:57 crc kubenswrapper[4760]: I1204 12:18:57.488756 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 12:18:57 crc kubenswrapper[4760]: I1204 12:18:57.658923 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 12:18:58 crc kubenswrapper[4760]: I1204 12:18:58.037277 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 12:18:58 crc kubenswrapper[4760]: I1204 12:18:58.494835 4760 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 12:18:58 crc kubenswrapper[4760]: I1204 12:18:58.512343 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 04 12:18:59 crc kubenswrapper[4760]: I1204 12:18:59.816411 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 12:19:00 crc kubenswrapper[4760]: I1204 12:19:00.229146 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 12:19:00 crc kubenswrapper[4760]: I1204 12:19:00.478629 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 12:19:00 crc kubenswrapper[4760]: I1204 12:19:00.691331 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 12:19:00 crc kubenswrapper[4760]: I1204 12:19:00.848108 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 12:19:01 crc kubenswrapper[4760]: I1204 12:19:01.395523 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 12:19:02 crc kubenswrapper[4760]: I1204 12:19:02.744492 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 12:19:04 crc kubenswrapper[4760]: I1204 12:19:04.978024 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 12:19:07 crc kubenswrapper[4760]: I1204 12:19:07.221944 4760 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 12:19:08 crc kubenswrapper[4760]: I1204 12:19:08.053557 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 12:19:08 crc kubenswrapper[4760]: I1204 12:19:08.094632 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 12:19:08 crc kubenswrapper[4760]: I1204 12:19:08.474693 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 12:19:08 crc kubenswrapper[4760]: I1204 12:19:08.597526 4760 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 12:19:08 crc kubenswrapper[4760]: I1204 12:19:08.930838 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 12:19:09 crc kubenswrapper[4760]: I1204 12:19:09.035405 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 12:19:09 crc kubenswrapper[4760]: I1204 12:19:09.047503 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 12:19:09 crc kubenswrapper[4760]: I1204 12:19:09.893662 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 12:19:09 crc kubenswrapper[4760]: I1204 12:19:09.904403 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 12:19:10 crc kubenswrapper[4760]: I1204 12:19:10.353361 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 12:19:11 crc kubenswrapper[4760]: I1204 12:19:11.057004 4760 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 12:19:11 crc kubenswrapper[4760]: I1204 12:19:11.591875 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 12:19:11 crc kubenswrapper[4760]: I1204 12:19:11.692800 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 12:19:11 crc kubenswrapper[4760]: I1204 12:19:11.757548 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 12:19:11 crc kubenswrapper[4760]: I1204 12:19:11.992434 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 12:19:12 crc kubenswrapper[4760]: I1204 12:19:12.131654 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 12:19:12 crc kubenswrapper[4760]: I1204 12:19:12.145985 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 12:19:12 crc kubenswrapper[4760]: I1204 12:19:12.381859 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 12:19:12 crc kubenswrapper[4760]: I1204 12:19:12.682464 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 12:19:12 crc kubenswrapper[4760]: I1204 12:19:12.989083 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 04 12:19:13 crc kubenswrapper[4760]: I1204 12:19:13.771905 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 12:19:14 crc kubenswrapper[4760]: I1204 12:19:14.250561 
4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 12:19:15 crc kubenswrapper[4760]: I1204 12:19:15.114304 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 12:19:15 crc kubenswrapper[4760]: I1204 12:19:15.118915 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 12:19:15 crc kubenswrapper[4760]: I1204 12:19:15.280378 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 12:19:15 crc kubenswrapper[4760]: I1204 12:19:15.471561 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 04 12:19:15 crc kubenswrapper[4760]: I1204 12:19:15.523410 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 12:19:15 crc kubenswrapper[4760]: I1204 12:19:15.884346 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 12:19:16 crc kubenswrapper[4760]: I1204 12:19:16.352807 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 12:19:16 crc kubenswrapper[4760]: I1204 12:19:16.490759 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 12:19:16 crc kubenswrapper[4760]: I1204 12:19:16.540125 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 12:19:16 crc kubenswrapper[4760]: I1204 12:19:16.569609 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" 
Dec 04 12:19:16 crc kubenswrapper[4760]: I1204 12:19:16.601435 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 12:19:16 crc kubenswrapper[4760]: I1204 12:19:16.684538 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 12:19:16 crc kubenswrapper[4760]: I1204 12:19:16.714068 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 12:19:16 crc kubenswrapper[4760]: I1204 12:19:16.915543 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 12:19:17 crc kubenswrapper[4760]: I1204 12:19:17.000903 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 12:19:17 crc kubenswrapper[4760]: I1204 12:19:17.261325 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 12:19:17 crc kubenswrapper[4760]: I1204 12:19:17.332703 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 12:19:17 crc kubenswrapper[4760]: I1204 12:19:17.380564 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 12:19:17 crc kubenswrapper[4760]: I1204 12:19:17.742726 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 12:19:17 crc kubenswrapper[4760]: I1204 12:19:17.789448 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 12:19:18 crc kubenswrapper[4760]: I1204 12:19:18.179432 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 12:19:18 crc kubenswrapper[4760]: I1204 12:19:18.211176 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.348920 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.349821 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.371808 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.384830 4760 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.559796 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.682543 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.700969 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.833726 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.837143 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.992911 4760 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 12:19:19 crc kubenswrapper[4760]: I1204 12:19:19.998329 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.064259 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.123275 4760 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.124429 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bptn9" podStartSLOduration=70.34375877 podStartE2EDuration="3m11.124408191s" podCreationTimestamp="2025-12-04 12:16:09 +0000 UTC" firstStartedPulling="2025-12-04 12:16:11.796373273 +0000 UTC m=+174.837819830" lastFinishedPulling="2025-12-04 12:18:12.577022684 +0000 UTC m=+295.618469251" observedRunningTime="2025-12-04 12:18:37.296364269 +0000 UTC m=+320.337810836" watchObservedRunningTime="2025-12-04 12:19:20.124408191 +0000 UTC m=+363.165854758" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.125063 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rpnt8" podStartSLOduration=71.336976167 podStartE2EDuration="3m11.125055592s" podCreationTimestamp="2025-12-04 12:16:09 +0000 UTC" firstStartedPulling="2025-12-04 12:16:12.995672452 +0000 UTC m=+176.037119019" lastFinishedPulling="2025-12-04 12:18:12.783751867 +0000 UTC m=+295.825198444" observedRunningTime="2025-12-04 12:18:37.176861118 +0000 UTC m=+320.218307685" watchObservedRunningTime="2025-12-04 12:19:20.125055592 +0000 UTC m=+363.166502169" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.125424 4760 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/certified-operators-l9t6h" podStartSLOduration=70.443963401 podStartE2EDuration="3m11.125417654s" podCreationTimestamp="2025-12-04 12:16:09 +0000 UTC" firstStartedPulling="2025-12-04 12:16:11.811945404 +0000 UTC m=+174.853391971" lastFinishedPulling="2025-12-04 12:18:12.493399657 +0000 UTC m=+295.534846224" observedRunningTime="2025-12-04 12:18:37.192984182 +0000 UTC m=+320.234430749" watchObservedRunningTime="2025-12-04 12:19:20.125417654 +0000 UTC m=+363.166864221" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.125516 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-94pdb" podStartSLOduration=71.310284187 podStartE2EDuration="3m12.125512187s" podCreationTimestamp="2025-12-04 12:16:08 +0000 UTC" firstStartedPulling="2025-12-04 12:16:11.812557163 +0000 UTC m=+174.854003730" lastFinishedPulling="2025-12-04 12:18:12.627785163 +0000 UTC m=+295.669231730" observedRunningTime="2025-12-04 12:18:37.403469406 +0000 UTC m=+320.444915963" watchObservedRunningTime="2025-12-04 12:19:20.125512187 +0000 UTC m=+363.166958754" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.125867 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lldkh" podStartSLOduration=72.819023763 podStartE2EDuration="3m9.125863027s" podCreationTimestamp="2025-12-04 12:16:11 +0000 UTC" firstStartedPulling="2025-12-04 12:16:16.688435259 +0000 UTC m=+179.729881826" lastFinishedPulling="2025-12-04 12:18:12.995274523 +0000 UTC m=+296.036721090" observedRunningTime="2025-12-04 12:18:37.43212753 +0000 UTC m=+320.473574097" watchObservedRunningTime="2025-12-04 12:19:20.125863027 +0000 UTC m=+363.167309594" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.125958 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pt6wf" 
podStartSLOduration=71.729583348 podStartE2EDuration="3m8.12595472s" podCreationTimestamp="2025-12-04 12:16:12 +0000 UTC" firstStartedPulling="2025-12-04 12:16:16.698268101 +0000 UTC m=+179.739714668" lastFinishedPulling="2025-12-04 12:18:13.094639473 +0000 UTC m=+296.136086040" observedRunningTime="2025-12-04 12:18:37.213065483 +0000 UTC m=+320.254512050" watchObservedRunningTime="2025-12-04 12:19:20.12595472 +0000 UTC m=+363.167401287" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.126039 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ql8tz" podStartSLOduration=74.581176676 podStartE2EDuration="3m8.126035513s" podCreationTimestamp="2025-12-04 12:16:12 +0000 UTC" firstStartedPulling="2025-12-04 12:16:18.781248714 +0000 UTC m=+181.822695281" lastFinishedPulling="2025-12-04 12:18:12.326107551 +0000 UTC m=+295.367554118" observedRunningTime="2025-12-04 12:18:37.377932901 +0000 UTC m=+320.419379478" watchObservedRunningTime="2025-12-04 12:19:20.126035513 +0000 UTC m=+363.167482080" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.127169 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=73.127159389 podStartE2EDuration="1m13.127159389s" podCreationTimestamp="2025-12-04 12:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:18:37.242370337 +0000 UTC m=+320.283816904" watchObservedRunningTime="2025-12-04 12:19:20.127159389 +0000 UTC m=+363.168605976" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.127366 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g48rj" podStartSLOduration=70.455204909 podStartE2EDuration="3m9.127361205s" podCreationTimestamp="2025-12-04 12:16:11 +0000 UTC" 
firstStartedPulling="2025-12-04 12:16:14.029514163 +0000 UTC m=+177.070960740" lastFinishedPulling="2025-12-04 12:18:12.701670469 +0000 UTC m=+295.743117036" observedRunningTime="2025-12-04 12:18:37.356511198 +0000 UTC m=+320.397957785" watchObservedRunningTime="2025-12-04 12:19:20.127361205 +0000 UTC m=+363.168807772" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.129582 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-bmjcx"] Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.129649 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.134562 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.134865 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.157128 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=43.157106916 podStartE2EDuration="43.157106916s" podCreationTimestamp="2025-12-04 12:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:19:20.150389204 +0000 UTC m=+363.191835781" watchObservedRunningTime="2025-12-04 12:19:20.157106916 +0000 UTC m=+363.198553503" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.238144 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.254167 4760 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.280624 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.315940 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.572012 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.620279 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.761602 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.895396 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.957312 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 12:19:20 crc kubenswrapper[4760]: I1204 12:19:20.979185 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 12:19:21 crc kubenswrapper[4760]: I1204 12:19:21.075852 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 12:19:21 crc kubenswrapper[4760]: I1204 12:19:21.100295 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 12:19:21 crc kubenswrapper[4760]: I1204 12:19:21.299801 
4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 12:19:21 crc kubenswrapper[4760]: I1204 12:19:21.382706 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 12:19:21 crc kubenswrapper[4760]: I1204 12:19:21.475962 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 12:19:21 crc kubenswrapper[4760]: I1204 12:19:21.547924 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 12:19:21 crc kubenswrapper[4760]: I1204 12:19:21.959480 4760 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 12:19:21 crc kubenswrapper[4760]: I1204 12:19:21.960438 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 12:19:21 crc kubenswrapper[4760]: I1204 12:19:21.969318 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" path="/var/lib/kubelet/pods/87218323-b321-4cd8-8da1-5fa8769eb3b0/volumes" Dec 04 12:19:22 crc kubenswrapper[4760]: I1204 12:19:22.092081 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 12:19:22 crc kubenswrapper[4760]: I1204 12:19:22.099366 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 12:19:22 crc kubenswrapper[4760]: I1204 12:19:22.122435 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 12:19:22 crc kubenswrapper[4760]: I1204 12:19:22.150690 4760 kubelet.go:2431] "SyncLoop REMOVE" 
source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 12:19:22 crc kubenswrapper[4760]: I1204 12:19:22.150983 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f9e315a94ae24f035145083f83669cda0bc2a4b28b24cc60ccdfa621ad94119e" gracePeriod=5 Dec 04 12:19:22 crc kubenswrapper[4760]: I1204 12:19:22.316935 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 12:19:22 crc kubenswrapper[4760]: I1204 12:19:22.502143 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 12:19:22 crc kubenswrapper[4760]: I1204 12:19:22.869224 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 12:19:22 crc kubenswrapper[4760]: I1204 12:19:22.883586 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 12:19:22 crc kubenswrapper[4760]: I1204 12:19:22.993478 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.003370 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.149178 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.253516 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.276526 4760 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.326036 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.351158 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.404337 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.418099 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.554682 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.586295 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.632767 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.810575 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.822286 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 04 12:19:23 crc kubenswrapper[4760]: I1204 12:19:23.870652 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 12:19:24 crc kubenswrapper[4760]: I1204 12:19:24.061539 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 12:19:24 crc kubenswrapper[4760]: I1204 12:19:24.178742 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 12:19:24 crc kubenswrapper[4760]: I1204 12:19:24.254415 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 12:19:24 crc kubenswrapper[4760]: I1204 12:19:24.259165 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 04 12:19:24 crc kubenswrapper[4760]: I1204 12:19:24.319869 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 12:19:24 crc kubenswrapper[4760]: I1204 12:19:24.347766 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 12:19:24 crc kubenswrapper[4760]: I1204 12:19:24.420782 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 12:19:24 crc kubenswrapper[4760]: I1204 12:19:24.670061 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 12:19:24 crc kubenswrapper[4760]: I1204 12:19:24.754886 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 12:19:24 crc kubenswrapper[4760]: I1204 12:19:24.919096 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 04 12:19:24 crc 
kubenswrapper[4760]: I1204 12:19:24.926290 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 12:19:25 crc kubenswrapper[4760]: I1204 12:19:25.035202 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 04 12:19:25 crc kubenswrapper[4760]: I1204 12:19:25.045328 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 12:19:25 crc kubenswrapper[4760]: I1204 12:19:25.396485 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 12:19:25 crc kubenswrapper[4760]: I1204 12:19:25.602189 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 04 12:19:25 crc kubenswrapper[4760]: I1204 12:19:25.604445 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 12:19:25 crc kubenswrapper[4760]: I1204 12:19:25.604537 4760 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5df0a5a1166a7b104a1b63808cf584bbfaf8f0f03f35b2d6ad3bbcd8dc811ee8" exitCode=137 Dec 04 12:19:25 crc kubenswrapper[4760]: I1204 12:19:25.604614 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5df0a5a1166a7b104a1b63808cf584bbfaf8f0f03f35b2d6ad3bbcd8dc811ee8"} Dec 04 12:19:25 crc kubenswrapper[4760]: I1204 12:19:25.604678 4760 scope.go:117] "RemoveContainer" 
containerID="cfe917228a2b94f35e3443f857af11616a8282eed0319a62f697d40494bbbe64" Dec 04 12:19:25 crc kubenswrapper[4760]: I1204 12:19:25.811577 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 12:19:25 crc kubenswrapper[4760]: I1204 12:19:25.896156 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.006365 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.083829 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.327451 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.384569 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.464847 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.618985 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.620586 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"948d11d04e019aefe44b6dc9c4d46abfd5d72a4cf29994d62fb5643f458e89d6"} Dec 04 12:19:26 crc 
kubenswrapper[4760]: I1204 12:19:26.691654 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.692020 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.754283 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.877159 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 12:19:26 crc kubenswrapper[4760]: I1204 12:19:26.932646 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.041297 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.047155 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.151055 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.267651 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.307944 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.561859 4760 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"trusted-ca" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.628624 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.628676 4760 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f9e315a94ae24f035145083f83669cda0bc2a4b28b24cc60ccdfa621ad94119e" exitCode=137 Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.744787 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.866475 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.866831 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.870623 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.872265 4760 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.883911 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.883942 4760 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3dc1c9ec-89b2-49cb-8a44-6c678ef77fef" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.888994 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.889288 4760 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3dc1c9ec-89b2-49cb-8a44-6c678ef77fef" Dec 04 12:19:27 crc kubenswrapper[4760]: I1204 12:19:27.889707 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061026 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061304 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061344 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061378 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061415 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061405 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061464 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061514 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061605 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061842 4760 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061868 4760 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061883 4760 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.061896 4760 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.069567 4760 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.093410 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.162917 4760 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.198715 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.635498 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.635570 4760 scope.go:117] "RemoveContainer" containerID="f9e315a94ae24f035145083f83669cda0bc2a4b28b24cc60ccdfa621ad94119e" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.635665 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.792537 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 04 12:19:28 crc kubenswrapper[4760]: I1204 12:19:28.931587 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 04 12:19:29 crc kubenswrapper[4760]: I1204 12:19:29.070569 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 12:19:29 crc kubenswrapper[4760]: I1204 12:19:29.100386 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 12:19:29 crc kubenswrapper[4760]: I1204 12:19:29.381124 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 12:19:29 crc kubenswrapper[4760]: I1204 12:19:29.447333 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 12:19:29 crc kubenswrapper[4760]: I1204 12:19:29.489439 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 12:19:29 crc kubenswrapper[4760]: I1204 12:19:29.490583 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 12:19:29 crc kubenswrapper[4760]: I1204 12:19:29.751531 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 04 12:19:29 crc kubenswrapper[4760]: I1204 12:19:29.872485 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 04 12:19:29 crc kubenswrapper[4760]: I1204 12:19:29.874603 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 12:19:29 crc kubenswrapper[4760]: I1204 12:19:29.995133 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 12:19:30 crc kubenswrapper[4760]: I1204 12:19:30.128075 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 12:19:30 crc kubenswrapper[4760]: I1204 12:19:30.250904 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 12:19:30 crc kubenswrapper[4760]: I1204 12:19:30.277568 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 12:19:30 crc kubenswrapper[4760]: I1204 12:19:30.303201 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 12:19:30 crc kubenswrapper[4760]: I1204 12:19:30.347406 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 12:19:30 crc kubenswrapper[4760]: I1204 12:19:30.604187 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 12:19:30 crc kubenswrapper[4760]: I1204 12:19:30.704996 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 12:19:30 crc kubenswrapper[4760]: I1204 12:19:30.908110 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 04 12:19:31 crc 
kubenswrapper[4760]: I1204 12:19:31.198712 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 12:19:31 crc kubenswrapper[4760]: I1204 12:19:31.240844 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 12:19:31 crc kubenswrapper[4760]: I1204 12:19:31.289228 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 12:19:31 crc kubenswrapper[4760]: I1204 12:19:31.339285 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 04 12:19:31 crc kubenswrapper[4760]: I1204 12:19:31.362009 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 12:19:31 crc kubenswrapper[4760]: I1204 12:19:31.523055 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 12:19:31 crc kubenswrapper[4760]: I1204 12:19:31.693448 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 12:19:31 crc kubenswrapper[4760]: I1204 12:19:31.756617 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 12:19:31 crc kubenswrapper[4760]: I1204 12:19:31.940540 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:19:32 crc kubenswrapper[4760]: I1204 12:19:32.110789 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 12:19:32 crc kubenswrapper[4760]: I1204 12:19:32.111350 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 12:19:32 crc kubenswrapper[4760]: I1204 12:19:32.353136 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 12:19:32 crc kubenswrapper[4760]: I1204 12:19:32.414819 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 12:19:32 crc kubenswrapper[4760]: I1204 12:19:32.608540 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 12:19:32 crc kubenswrapper[4760]: I1204 12:19:32.889313 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.150947 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.243967 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.264427 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.380964 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.381036 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" 
podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.531695 4760 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.572899 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.576359 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.645601 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.652851 4760 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.721975 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.811804 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 12:19:33 crc kubenswrapper[4760]: I1204 12:19:33.910475 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.269611 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.332698 4760 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.517331 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.595231 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6fffd54687-lv7jp"] Dec 04 12:19:34 crc kubenswrapper[4760]: E1204 12:19:34.595577 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.595597 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 12:19:34 crc kubenswrapper[4760]: E1204 12:19:34.595606 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" containerName="installer" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.595614 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" containerName="installer" Dec 04 12:19:34 crc kubenswrapper[4760]: E1204 12:19:34.595642 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" containerName="oauth-openshift" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.595648 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" containerName="oauth-openshift" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.595770 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e14fed8-554b-47bd-8acd-47076f641fe2" containerName="installer" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.595790 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.595803 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="87218323-b321-4cd8-8da1-5fa8769eb3b0" containerName="oauth-openshift" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.596588 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.601863 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.602137 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.602188 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.602605 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.602320 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.602803 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.602851 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.602946 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 
12:19:34.603108 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.602554 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.603368 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.604329 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.632168 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6fffd54687-lv7jp"] Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.633199 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.645798 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.650627 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.748526 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc 
kubenswrapper[4760]: I1204 12:19:34.748592 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.748623 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-template-login\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.748831 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmtdc\" (UniqueName: \"kubernetes.io/projected/df22b449-9c3c-4d16-98dd-c8b87c88c162-kube-api-access-rmtdc\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.748982 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.749031 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/df22b449-9c3c-4d16-98dd-c8b87c88c162-audit-dir\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.749062 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-audit-policies\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.749184 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.749298 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-session\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.749344 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " 
pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.749368 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-template-error\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.749454 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.749553 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.749576 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: 
I1204 12:19:34.849996 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850061 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/df22b449-9c3c-4d16-98dd-c8b87c88c162-audit-dir\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850087 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-audit-policies\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850112 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850130 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-session\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: 
\"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850148 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850164 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-template-error\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850187 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850232 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850276 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850302 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.850960 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-service-ca\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.851098 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-audit-policies\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.851110 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6fffd54687-lv7jp\" 
(UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.851186 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/df22b449-9c3c-4d16-98dd-c8b87c88c162-audit-dir\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.851239 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-template-login\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.851313 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmtdc\" (UniqueName: \"kubernetes.io/projected/df22b449-9c3c-4d16-98dd-c8b87c88c162-kube-api-access-rmtdc\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.852549 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.853569 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.858097 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-template-error\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.858924 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-session\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.859718 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.860247 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " 
pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.860358 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.860871 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-user-template-login\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.861629 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.861669 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/df22b449-9c3c-4d16-98dd-c8b87c88c162-v4-0-config-system-router-certs\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.870938 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmtdc\" 
(UniqueName: \"kubernetes.io/projected/df22b449-9c3c-4d16-98dd-c8b87c88c162-kube-api-access-rmtdc\") pod \"oauth-openshift-6fffd54687-lv7jp\" (UID: \"df22b449-9c3c-4d16-98dd-c8b87c88c162\") " pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:34 crc kubenswrapper[4760]: I1204 12:19:34.935036 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:35 crc kubenswrapper[4760]: I1204 12:19:35.076782 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 12:19:35 crc kubenswrapper[4760]: I1204 12:19:35.131346 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:19:35 crc kubenswrapper[4760]: I1204 12:19:35.137704 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:19:35 crc kubenswrapper[4760]: I1204 12:19:35.277564 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 12:19:35 crc kubenswrapper[4760]: I1204 12:19:35.391967 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6fffd54687-lv7jp"] Dec 04 12:19:35 crc kubenswrapper[4760]: I1204 12:19:35.681084 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" event={"ID":"df22b449-9c3c-4d16-98dd-c8b87c88c162","Type":"ContainerStarted","Data":"fd41ab56abd0eb67ebb1c266ddafa4780940aaf3c83d983f02727c0f5b878fbe"} Dec 04 12:19:35 crc kubenswrapper[4760]: I1204 12:19:35.685423 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 12:19:35 crc 
kubenswrapper[4760]: I1204 12:19:35.854365 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 12:19:35 crc kubenswrapper[4760]: I1204 12:19:35.937604 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 12:19:36 crc kubenswrapper[4760]: I1204 12:19:36.354198 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 12:19:36 crc kubenswrapper[4760]: I1204 12:19:36.690336 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6fffd54687-lv7jp_df22b449-9c3c-4d16-98dd-c8b87c88c162/oauth-openshift/0.log" Dec 04 12:19:36 crc kubenswrapper[4760]: I1204 12:19:36.690400 4760 generic.go:334] "Generic (PLEG): container finished" podID="df22b449-9c3c-4d16-98dd-c8b87c88c162" containerID="96d3b5f4399642fefa059ee2ef75337a7484b2585fbc91cd2d561af2aaf91ea9" exitCode=255 Dec 04 12:19:36 crc kubenswrapper[4760]: I1204 12:19:36.690486 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" event={"ID":"df22b449-9c3c-4d16-98dd-c8b87c88c162","Type":"ContainerDied","Data":"96d3b5f4399642fefa059ee2ef75337a7484b2585fbc91cd2d561af2aaf91ea9"} Dec 04 12:19:36 crc kubenswrapper[4760]: I1204 12:19:36.691470 4760 scope.go:117] "RemoveContainer" containerID="96d3b5f4399642fefa059ee2ef75337a7484b2585fbc91cd2d561af2aaf91ea9" Dec 04 12:19:36 crc kubenswrapper[4760]: I1204 12:19:36.810409 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 12:19:37 crc kubenswrapper[4760]: I1204 12:19:37.699965 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6fffd54687-lv7jp_df22b449-9c3c-4d16-98dd-c8b87c88c162/oauth-openshift/0.log" Dec 04 12:19:37 crc kubenswrapper[4760]: 
I1204 12:19:37.700078 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" event={"ID":"df22b449-9c3c-4d16-98dd-c8b87c88c162","Type":"ContainerStarted","Data":"a1d3edb4a7bfc4704ce8edaad97c284cb3b880103a27dc098ef3f9f4b5f48b5b"} Dec 04 12:19:37 crc kubenswrapper[4760]: I1204 12:19:37.701026 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:37 crc kubenswrapper[4760]: I1204 12:19:37.702573 4760 generic.go:334] "Generic (PLEG): container finished" podID="2262a901-d392-434b-bd32-43555b67f428" containerID="d1417a77a6420bb4709d113547dfa6fd8bcfa406b64291604746a1c6221f68c2" exitCode=0 Dec 04 12:19:37 crc kubenswrapper[4760]: I1204 12:19:37.702644 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" event={"ID":"2262a901-d392-434b-bd32-43555b67f428","Type":"ContainerDied","Data":"d1417a77a6420bb4709d113547dfa6fd8bcfa406b64291604746a1c6221f68c2"} Dec 04 12:19:37 crc kubenswrapper[4760]: I1204 12:19:37.703405 4760 scope.go:117] "RemoveContainer" containerID="d1417a77a6420bb4709d113547dfa6fd8bcfa406b64291604746a1c6221f68c2" Dec 04 12:19:37 crc kubenswrapper[4760]: I1204 12:19:37.713927 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" Dec 04 12:19:37 crc kubenswrapper[4760]: I1204 12:19:37.733367 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6fffd54687-lv7jp" podStartSLOduration=111.733347871 podStartE2EDuration="1m51.733347871s" podCreationTimestamp="2025-12-04 12:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:19:37.727557608 +0000 UTC m=+380.769004175" 
watchObservedRunningTime="2025-12-04 12:19:37.733347871 +0000 UTC m=+380.774794438" Dec 04 12:19:37 crc kubenswrapper[4760]: I1204 12:19:37.911881 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 12:19:38 crc kubenswrapper[4760]: I1204 12:19:38.389432 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 12:19:38 crc kubenswrapper[4760]: I1204 12:19:38.716992 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" event={"ID":"2262a901-d392-434b-bd32-43555b67f428","Type":"ContainerStarted","Data":"576ce8d59675149466424ee55ad9092e3f14024e00aba17735a11ccc3cfcb3f8"} Dec 04 12:19:38 crc kubenswrapper[4760]: I1204 12:19:38.718459 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:19:38 crc kubenswrapper[4760]: I1204 12:19:38.719955 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:19:39 crc kubenswrapper[4760]: I1204 12:19:39.290170 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 12:19:39 crc kubenswrapper[4760]: I1204 12:19:39.737176 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 12:19:42 crc kubenswrapper[4760]: I1204 12:19:42.815725 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 12:19:42 crc kubenswrapper[4760]: I1204 12:19:42.910331 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 12:19:42 crc 
kubenswrapper[4760]: I1204 12:19:42.964933 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 04 12:19:43 crc kubenswrapper[4760]: I1204 12:19:43.011080 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 12:20:03 crc kubenswrapper[4760]: I1204 12:20:03.380334 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:20:03 crc kubenswrapper[4760]: I1204 12:20:03.380966 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:20:18 crc kubenswrapper[4760]: I1204 12:20:18.879000 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7fpdd"] Dec 04 12:20:18 crc kubenswrapper[4760]: I1204 12:20:18.879873 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" podUID="60c1d932-093d-416b-9c58-88ff3d559656" containerName="controller-manager" containerID="cri-o://7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6" gracePeriod=30 Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.017866 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"] Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.018478 4760 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" podUID="46e24d96-40d2-40c8-83cf-cf9bcddb570a" containerName="route-controller-manager" containerID="cri-o://bdb449ea01963bcdd046864279afc24ae036c935c568c8e58ee02d511917100e" gracePeriod=30 Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.716768 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.727245 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp2mn\" (UniqueName: \"kubernetes.io/projected/60c1d932-093d-416b-9c58-88ff3d559656-kube-api-access-cp2mn\") pod \"60c1d932-093d-416b-9c58-88ff3d559656\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.727455 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-config\") pod \"60c1d932-093d-416b-9c58-88ff3d559656\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.727814 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c1d932-093d-416b-9c58-88ff3d559656-serving-cert\") pod \"60c1d932-093d-416b-9c58-88ff3d559656\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.727901 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-proxy-ca-bundles\") pod \"60c1d932-093d-416b-9c58-88ff3d559656\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.727933 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-client-ca\") pod \"60c1d932-093d-416b-9c58-88ff3d559656\" (UID: \"60c1d932-093d-416b-9c58-88ff3d559656\") " Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.728968 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "60c1d932-093d-416b-9c58-88ff3d559656" (UID: "60c1d932-093d-416b-9c58-88ff3d559656"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.729031 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-config" (OuterVolumeSpecName: "config") pod "60c1d932-093d-416b-9c58-88ff3d559656" (UID: "60c1d932-093d-416b-9c58-88ff3d559656"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.729074 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-client-ca" (OuterVolumeSpecName: "client-ca") pod "60c1d932-093d-416b-9c58-88ff3d559656" (UID: "60c1d932-093d-416b-9c58-88ff3d559656"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.735626 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60c1d932-093d-416b-9c58-88ff3d559656-kube-api-access-cp2mn" (OuterVolumeSpecName: "kube-api-access-cp2mn") pod "60c1d932-093d-416b-9c58-88ff3d559656" (UID: "60c1d932-093d-416b-9c58-88ff3d559656"). InnerVolumeSpecName "kube-api-access-cp2mn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.736840 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60c1d932-093d-416b-9c58-88ff3d559656-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "60c1d932-093d-416b-9c58-88ff3d559656" (UID: "60c1d932-093d-416b-9c58-88ff3d559656"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.829133 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60c1d932-093d-416b-9c58-88ff3d559656-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.829184 4760 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.829204 4760 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.829239 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp2mn\" (UniqueName: \"kubernetes.io/projected/60c1d932-093d-416b-9c58-88ff3d559656-kube-api-access-cp2mn\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.829252 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c1d932-093d-416b-9c58-88ff3d559656-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.995562 4760 generic.go:334] "Generic (PLEG): container finished" podID="60c1d932-093d-416b-9c58-88ff3d559656" 
containerID="7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6" exitCode=0 Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.995638 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.996998 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" event={"ID":"60c1d932-093d-416b-9c58-88ff3d559656","Type":"ContainerDied","Data":"7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6"} Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.997086 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7fpdd" event={"ID":"60c1d932-093d-416b-9c58-88ff3d559656","Type":"ContainerDied","Data":"38b37d742ff3aa4166c846a73485934a79500ffe45508c05cb7f3359131bd850"} Dec 04 12:20:19 crc kubenswrapper[4760]: I1204 12:20:19.997110 4760 scope.go:117] "RemoveContainer" containerID="7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.001201 4760 generic.go:334] "Generic (PLEG): container finished" podID="46e24d96-40d2-40c8-83cf-cf9bcddb570a" containerID="bdb449ea01963bcdd046864279afc24ae036c935c568c8e58ee02d511917100e" exitCode=0 Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.001287 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" event={"ID":"46e24d96-40d2-40c8-83cf-cf9bcddb570a","Type":"ContainerDied","Data":"bdb449ea01963bcdd046864279afc24ae036c935c568c8e58ee02d511917100e"} Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.036115 4760 scope.go:117] "RemoveContainer" containerID="7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6" Dec 04 12:20:20 crc kubenswrapper[4760]: E1204 
12:20:20.036784 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6\": container with ID starting with 7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6 not found: ID does not exist" containerID="7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.036832 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6"} err="failed to get container status \"7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6\": rpc error: code = NotFound desc = could not find container \"7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6\": container with ID starting with 7acdd9267e2b477c0785f90e0d32de575312bdffd8b7f6542a78e1c3ef11bac6 not found: ID does not exist" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.037375 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7fpdd"] Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.041803 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7fpdd"] Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.351731 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.446073 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5rkg\" (UniqueName: \"kubernetes.io/projected/46e24d96-40d2-40c8-83cf-cf9bcddb570a-kube-api-access-v5rkg\") pod \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.446141 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-config\") pod \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.446173 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-client-ca\") pod \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.446245 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e24d96-40d2-40c8-83cf-cf9bcddb570a-serving-cert\") pod \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\" (UID: \"46e24d96-40d2-40c8-83cf-cf9bcddb570a\") " Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.447504 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-config" (OuterVolumeSpecName: "config") pod "46e24d96-40d2-40c8-83cf-cf9bcddb570a" (UID: "46e24d96-40d2-40c8-83cf-cf9bcddb570a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.447695 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-client-ca" (OuterVolumeSpecName: "client-ca") pod "46e24d96-40d2-40c8-83cf-cf9bcddb570a" (UID: "46e24d96-40d2-40c8-83cf-cf9bcddb570a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.452635 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e24d96-40d2-40c8-83cf-cf9bcddb570a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "46e24d96-40d2-40c8-83cf-cf9bcddb570a" (UID: "46e24d96-40d2-40c8-83cf-cf9bcddb570a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.452861 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e24d96-40d2-40c8-83cf-cf9bcddb570a-kube-api-access-v5rkg" (OuterVolumeSpecName: "kube-api-access-v5rkg") pod "46e24d96-40d2-40c8-83cf-cf9bcddb570a" (UID: "46e24d96-40d2-40c8-83cf-cf9bcddb570a"). InnerVolumeSpecName "kube-api-access-v5rkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.547676 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.547716 4760 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46e24d96-40d2-40c8-83cf-cf9bcddb570a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.547728 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46e24d96-40d2-40c8-83cf-cf9bcddb570a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.547738 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5rkg\" (UniqueName: \"kubernetes.io/projected/46e24d96-40d2-40c8-83cf-cf9bcddb570a-kube-api-access-v5rkg\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.877377 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb"] Dec 04 12:20:20 crc kubenswrapper[4760]: E1204 12:20:20.877858 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c1d932-093d-416b-9c58-88ff3d559656" containerName="controller-manager" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.877881 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c1d932-093d-416b-9c58-88ff3d559656" containerName="controller-manager" Dec 04 12:20:20 crc kubenswrapper[4760]: E1204 12:20:20.877900 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e24d96-40d2-40c8-83cf-cf9bcddb570a" containerName="route-controller-manager" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.877911 4760 
state_mem.go:107] "Deleted CPUSet assignment" podUID="46e24d96-40d2-40c8-83cf-cf9bcddb570a" containerName="route-controller-manager" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.878098 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e24d96-40d2-40c8-83cf-cf9bcddb570a" containerName="route-controller-manager" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.878133 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="60c1d932-093d-416b-9c58-88ff3d559656" containerName="controller-manager" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.878976 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.885033 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf"] Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.886585 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.889082 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.890306 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.890493 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.890733 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.890793 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.892105 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.892482 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb"] Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.897698 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf"] Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.900658 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.953963 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5831b45f-12dc-4c23-87e7-85469ece8a11-serving-cert\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.954499 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd0141c2-7d40-49c1-b41e-b0f763cb0402-proxy-ca-bundles\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.954636 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd0141c2-7d40-49c1-b41e-b0f763cb0402-serving-cert\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.954775 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkppg\" (UniqueName: \"kubernetes.io/projected/5831b45f-12dc-4c23-87e7-85469ece8a11-kube-api-access-jkppg\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.954927 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0141c2-7d40-49c1-b41e-b0f763cb0402-config\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " 
pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.955044 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd0141c2-7d40-49c1-b41e-b0f763cb0402-client-ca\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.955192 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-client-ca\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.955352 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-config\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:20 crc kubenswrapper[4760]: I1204 12:20:20.955709 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bgtd\" (UniqueName: \"kubernetes.io/projected/fd0141c2-7d40-49c1-b41e-b0f763cb0402-kube-api-access-2bgtd\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.010143 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" event={"ID":"46e24d96-40d2-40c8-83cf-cf9bcddb570a","Type":"ContainerDied","Data":"08068c352e339f2942f62ca1301c82ff354e74ffd54fa977cd8b357e07d5b62d"} Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.010256 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.010271 4760 scope.go:117] "RemoveContainer" containerID="bdb449ea01963bcdd046864279afc24ae036c935c568c8e58ee02d511917100e" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.048272 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"] Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.051921 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xxt2f"] Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.060592 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5831b45f-12dc-4c23-87e7-85469ece8a11-serving-cert\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.060662 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd0141c2-7d40-49c1-b41e-b0f763cb0402-proxy-ca-bundles\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.060687 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd0141c2-7d40-49c1-b41e-b0f763cb0402-serving-cert\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.060717 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkppg\" (UniqueName: \"kubernetes.io/projected/5831b45f-12dc-4c23-87e7-85469ece8a11-kube-api-access-jkppg\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.060762 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0141c2-7d40-49c1-b41e-b0f763cb0402-config\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.060790 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd0141c2-7d40-49c1-b41e-b0f763cb0402-client-ca\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.060819 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-client-ca\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " 
pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.060845 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-config\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.060878 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bgtd\" (UniqueName: \"kubernetes.io/projected/fd0141c2-7d40-49c1-b41e-b0f763cb0402-kube-api-access-2bgtd\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.062157 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd0141c2-7d40-49c1-b41e-b0f763cb0402-client-ca\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.062312 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-client-ca\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.062435 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-config\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.062514 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0141c2-7d40-49c1-b41e-b0f763cb0402-config\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.063455 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fd0141c2-7d40-49c1-b41e-b0f763cb0402-proxy-ca-bundles\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.068128 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd0141c2-7d40-49c1-b41e-b0f763cb0402-serving-cert\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.077010 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5831b45f-12dc-4c23-87e7-85469ece8a11-serving-cert\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.086348 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jkppg\" (UniqueName: \"kubernetes.io/projected/5831b45f-12dc-4c23-87e7-85469ece8a11-kube-api-access-jkppg\") pod \"route-controller-manager-8697489c76-bjkxb\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.087086 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bgtd\" (UniqueName: \"kubernetes.io/projected/fd0141c2-7d40-49c1-b41e-b0f763cb0402-kube-api-access-2bgtd\") pod \"controller-manager-6ddddfd6f7-s25nf\" (UID: \"fd0141c2-7d40-49c1-b41e-b0f763cb0402\") " pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.211085 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.216888 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.453553 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb"] Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.516084 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf"] Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.873294 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e24d96-40d2-40c8-83cf-cf9bcddb570a" path="/var/lib/kubelet/pods/46e24d96-40d2-40c8-83cf-cf9bcddb570a/volumes" Dec 04 12:20:21 crc kubenswrapper[4760]: I1204 12:20:21.874785 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60c1d932-093d-416b-9c58-88ff3d559656" path="/var/lib/kubelet/pods/60c1d932-093d-416b-9c58-88ff3d559656/volumes" Dec 04 12:20:22 crc kubenswrapper[4760]: I1204 12:20:22.022385 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" event={"ID":"5831b45f-12dc-4c23-87e7-85469ece8a11","Type":"ContainerStarted","Data":"d7de6fe4f53aed46549301ca29411bed7185ced6b461385693b923bf0d2c22bf"} Dec 04 12:20:22 crc kubenswrapper[4760]: I1204 12:20:22.022468 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" event={"ID":"5831b45f-12dc-4c23-87e7-85469ece8a11","Type":"ContainerStarted","Data":"f87f36001402265a1cac03e2dd085d09e7106e9ff757f46936f98239209157aa"} Dec 04 12:20:22 crc kubenswrapper[4760]: I1204 12:20:22.026052 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" 
event={"ID":"fd0141c2-7d40-49c1-b41e-b0f763cb0402","Type":"ContainerStarted","Data":"d9afdb880c20e910fa94233a57e77d0df25390b311e6c5d735131af1adfd31e8"} Dec 04 12:20:22 crc kubenswrapper[4760]: I1204 12:20:22.026156 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" event={"ID":"fd0141c2-7d40-49c1-b41e-b0f763cb0402","Type":"ContainerStarted","Data":"564f5b5c99fd0d3a370a40e2a396202ec0d98d8fcf1a27574a78833c0337125e"} Dec 04 12:20:23 crc kubenswrapper[4760]: I1204 12:20:23.034631 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:23 crc kubenswrapper[4760]: I1204 12:20:23.035618 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:23 crc kubenswrapper[4760]: I1204 12:20:23.044042 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" Dec 04 12:20:23 crc kubenswrapper[4760]: I1204 12:20:23.053560 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:23 crc kubenswrapper[4760]: I1204 12:20:23.066841 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" podStartSLOduration=4.066815747 podStartE2EDuration="4.066815747s" podCreationTimestamp="2025-12-04 12:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:20:23.062316264 +0000 UTC m=+426.103762851" watchObservedRunningTime="2025-12-04 12:20:23.066815747 +0000 UTC m=+426.108262314" Dec 04 12:20:23 crc kubenswrapper[4760]: I1204 
12:20:23.097167 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6ddddfd6f7-s25nf" podStartSLOduration=4.09713343 podStartE2EDuration="4.09713343s" podCreationTimestamp="2025-12-04 12:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:20:23.09494083 +0000 UTC m=+426.136387397" watchObservedRunningTime="2025-12-04 12:20:23.09713343 +0000 UTC m=+426.138580007" Dec 04 12:20:33 crc kubenswrapper[4760]: I1204 12:20:33.381120 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:20:33 crc kubenswrapper[4760]: I1204 12:20:33.382283 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:20:33 crc kubenswrapper[4760]: I1204 12:20:33.382389 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:20:33 crc kubenswrapper[4760]: I1204 12:20:33.383594 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98eec381a8f810288ff27a9094a0c5f872e203c92b3c99fd590046c2ebbea2b9"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 12:20:33 crc kubenswrapper[4760]: I1204 12:20:33.383695 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://98eec381a8f810288ff27a9094a0c5f872e203c92b3c99fd590046c2ebbea2b9" gracePeriod=600 Dec 04 12:20:34 crc kubenswrapper[4760]: I1204 12:20:34.105145 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="98eec381a8f810288ff27a9094a0c5f872e203c92b3c99fd590046c2ebbea2b9" exitCode=0 Dec 04 12:20:34 crc kubenswrapper[4760]: I1204 12:20:34.105680 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"98eec381a8f810288ff27a9094a0c5f872e203c92b3c99fd590046c2ebbea2b9"} Dec 04 12:20:34 crc kubenswrapper[4760]: I1204 12:20:34.105722 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"1c74be0dd0598690df2185325325d6e89258dfec4d55d769746cb995ea68c9dd"} Dec 04 12:20:34 crc kubenswrapper[4760]: I1204 12:20:34.105742 4760 scope.go:117] "RemoveContainer" containerID="c81665eb77c37ed6f538339d1eaeca2608238099d8ff2f1cc5ccbde51de16683" Dec 04 12:20:58 crc kubenswrapper[4760]: I1204 12:20:58.877333 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb"] Dec 04 12:20:58 crc kubenswrapper[4760]: I1204 12:20:58.878098 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" podUID="5831b45f-12dc-4c23-87e7-85469ece8a11" containerName="route-controller-manager" 
containerID="cri-o://d7de6fe4f53aed46549301ca29411bed7185ced6b461385693b923bf0d2c22bf" gracePeriod=30 Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.262455 4760 generic.go:334] "Generic (PLEG): container finished" podID="5831b45f-12dc-4c23-87e7-85469ece8a11" containerID="d7de6fe4f53aed46549301ca29411bed7185ced6b461385693b923bf0d2c22bf" exitCode=0 Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.262605 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" event={"ID":"5831b45f-12dc-4c23-87e7-85469ece8a11","Type":"ContainerDied","Data":"d7de6fe4f53aed46549301ca29411bed7185ced6b461385693b923bf0d2c22bf"} Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.432845 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.497521 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5831b45f-12dc-4c23-87e7-85469ece8a11-serving-cert\") pod \"5831b45f-12dc-4c23-87e7-85469ece8a11\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.497582 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-config\") pod \"5831b45f-12dc-4c23-87e7-85469ece8a11\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.497631 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkppg\" (UniqueName: \"kubernetes.io/projected/5831b45f-12dc-4c23-87e7-85469ece8a11-kube-api-access-jkppg\") pod \"5831b45f-12dc-4c23-87e7-85469ece8a11\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " Dec 04 
12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.497739 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-client-ca\") pod \"5831b45f-12dc-4c23-87e7-85469ece8a11\" (UID: \"5831b45f-12dc-4c23-87e7-85469ece8a11\") " Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.499695 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-config" (OuterVolumeSpecName: "config") pod "5831b45f-12dc-4c23-87e7-85469ece8a11" (UID: "5831b45f-12dc-4c23-87e7-85469ece8a11"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.501150 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-client-ca" (OuterVolumeSpecName: "client-ca") pod "5831b45f-12dc-4c23-87e7-85469ece8a11" (UID: "5831b45f-12dc-4c23-87e7-85469ece8a11"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.509642 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5831b45f-12dc-4c23-87e7-85469ece8a11-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5831b45f-12dc-4c23-87e7-85469ece8a11" (UID: "5831b45f-12dc-4c23-87e7-85469ece8a11"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.524609 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5831b45f-12dc-4c23-87e7-85469ece8a11-kube-api-access-jkppg" (OuterVolumeSpecName: "kube-api-access-jkppg") pod "5831b45f-12dc-4c23-87e7-85469ece8a11" (UID: "5831b45f-12dc-4c23-87e7-85469ece8a11"). 
InnerVolumeSpecName "kube-api-access-jkppg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.599473 4760 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5831b45f-12dc-4c23-87e7-85469ece8a11-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.599527 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.599542 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkppg\" (UniqueName: \"kubernetes.io/projected/5831b45f-12dc-4c23-87e7-85469ece8a11-kube-api-access-jkppg\") on node \"crc\" DevicePath \"\"" Dec 04 12:20:59 crc kubenswrapper[4760]: I1204 12:20:59.599556 4760 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5831b45f-12dc-4c23-87e7-85469ece8a11-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.025792 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds"] Dec 04 12:21:00 crc kubenswrapper[4760]: E1204 12:21:00.026560 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5831b45f-12dc-4c23-87e7-85469ece8a11" containerName="route-controller-manager" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.026581 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5831b45f-12dc-4c23-87e7-85469ece8a11" containerName="route-controller-manager" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.026715 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5831b45f-12dc-4c23-87e7-85469ece8a11" containerName="route-controller-manager" Dec 04 12:21:00 crc 
kubenswrapper[4760]: I1204 12:21:00.027336 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.044765 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62ed054c-a431-405b-9e8f-7a5df685b871-client-ca\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.044823 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62ed054c-a431-405b-9e8f-7a5df685b871-config\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.044856 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmsr7\" (UniqueName: \"kubernetes.io/projected/62ed054c-a431-405b-9e8f-7a5df685b871-kube-api-access-lmsr7\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.044888 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62ed054c-a431-405b-9e8f-7a5df685b871-serving-cert\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " 
pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.050827 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds"] Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.146583 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62ed054c-a431-405b-9e8f-7a5df685b871-client-ca\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.146672 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62ed054c-a431-405b-9e8f-7a5df685b871-config\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.146711 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmsr7\" (UniqueName: \"kubernetes.io/projected/62ed054c-a431-405b-9e8f-7a5df685b871-kube-api-access-lmsr7\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.146739 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62ed054c-a431-405b-9e8f-7a5df685b871-serving-cert\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " 
pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.148342 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/62ed054c-a431-405b-9e8f-7a5df685b871-client-ca\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.149869 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62ed054c-a431-405b-9e8f-7a5df685b871-config\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.152982 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62ed054c-a431-405b-9e8f-7a5df685b871-serving-cert\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.169000 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmsr7\" (UniqueName: \"kubernetes.io/projected/62ed054c-a431-405b-9e8f-7a5df685b871-kube-api-access-lmsr7\") pod \"route-controller-manager-dbb45576-7ktds\" (UID: \"62ed054c-a431-405b-9e8f-7a5df685b871\") " pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.271337 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" 
event={"ID":"5831b45f-12dc-4c23-87e7-85469ece8a11","Type":"ContainerDied","Data":"f87f36001402265a1cac03e2dd085d09e7106e9ff757f46936f98239209157aa"} Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.271427 4760 scope.go:117] "RemoveContainer" containerID="d7de6fe4f53aed46549301ca29411bed7185ced6b461385693b923bf0d2c22bf" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.271842 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.300894 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb"] Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.309274 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-bjkxb"] Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.346157 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:00 crc kubenswrapper[4760]: I1204 12:21:00.773027 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds"] Dec 04 12:21:01 crc kubenswrapper[4760]: I1204 12:21:01.321066 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" event={"ID":"62ed054c-a431-405b-9e8f-7a5df685b871","Type":"ContainerStarted","Data":"3f9a8d2e1d1a38abec9ccd94bb1a2b652f77cbde78cec12092015f408534dfec"} Dec 04 12:21:01 crc kubenswrapper[4760]: I1204 12:21:01.873999 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5831b45f-12dc-4c23-87e7-85469ece8a11" path="/var/lib/kubelet/pods/5831b45f-12dc-4c23-87e7-85469ece8a11/volumes" Dec 04 12:21:02 crc kubenswrapper[4760]: I1204 12:21:02.436538 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" event={"ID":"62ed054c-a431-405b-9e8f-7a5df685b871","Type":"ContainerStarted","Data":"66bfb5c1dce61afcd5b9e8cb16d594d1a3b0f3f6244f5acd35dc29b9a1ac7e93"} Dec 04 12:21:02 crc kubenswrapper[4760]: I1204 12:21:02.437677 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:02 crc kubenswrapper[4760]: I1204 12:21:02.508677 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" podStartSLOduration=4.508648264 podStartE2EDuration="4.508648264s" podCreationTimestamp="2025-12-04 12:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:21:02.505155742 +0000 UTC m=+465.546602309" 
watchObservedRunningTime="2025-12-04 12:21:02.508648264 +0000 UTC m=+465.550094831" Dec 04 12:21:02 crc kubenswrapper[4760]: I1204 12:21:02.527412 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" Dec 04 12:21:02 crc kubenswrapper[4760]: I1204 12:21:02.817892 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xlqtm"] Dec 04 12:21:02 crc kubenswrapper[4760]: I1204 12:21:02.819276 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:02 crc kubenswrapper[4760]: I1204 12:21:02.840544 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xlqtm"] Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.006939 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c66435e7-b564-4895-b5a7-a40ec6bffa21-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.007139 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.007243 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvqs7\" (UniqueName: 
\"kubernetes.io/projected/c66435e7-b564-4895-b5a7-a40ec6bffa21-kube-api-access-bvqs7\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.007417 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c66435e7-b564-4895-b5a7-a40ec6bffa21-bound-sa-token\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.007506 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c66435e7-b564-4895-b5a7-a40ec6bffa21-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.007580 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c66435e7-b564-4895-b5a7-a40ec6bffa21-trusted-ca\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.007685 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c66435e7-b564-4895-b5a7-a40ec6bffa21-registry-certificates\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 
crc kubenswrapper[4760]: I1204 12:21:03.007709 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c66435e7-b564-4895-b5a7-a40ec6bffa21-registry-tls\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.070790 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.109020 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c66435e7-b564-4895-b5a7-a40ec6bffa21-registry-certificates\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.109082 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c66435e7-b564-4895-b5a7-a40ec6bffa21-registry-tls\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.109113 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c66435e7-b564-4895-b5a7-a40ec6bffa21-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: 
\"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.109163 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvqs7\" (UniqueName: \"kubernetes.io/projected/c66435e7-b564-4895-b5a7-a40ec6bffa21-kube-api-access-bvqs7\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.109238 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c66435e7-b564-4895-b5a7-a40ec6bffa21-bound-sa-token\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.109281 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c66435e7-b564-4895-b5a7-a40ec6bffa21-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.109309 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c66435e7-b564-4895-b5a7-a40ec6bffa21-trusted-ca\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.111244 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c66435e7-b564-4895-b5a7-a40ec6bffa21-trusted-ca\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.111962 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c66435e7-b564-4895-b5a7-a40ec6bffa21-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.112790 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c66435e7-b564-4895-b5a7-a40ec6bffa21-registry-certificates\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.119727 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c66435e7-b564-4895-b5a7-a40ec6bffa21-registry-tls\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.122735 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c66435e7-b564-4895-b5a7-a40ec6bffa21-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.129879 4760 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-bvqs7\" (UniqueName: \"kubernetes.io/projected/c66435e7-b564-4895-b5a7-a40ec6bffa21-kube-api-access-bvqs7\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.136551 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c66435e7-b564-4895-b5a7-a40ec6bffa21-bound-sa-token\") pod \"image-registry-66df7c8f76-xlqtm\" (UID: \"c66435e7-b564-4895-b5a7-a40ec6bffa21\") " pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:03 crc kubenswrapper[4760]: I1204 12:21:03.139521 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:04 crc kubenswrapper[4760]: I1204 12:21:04.330689 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xlqtm"] Dec 04 12:21:04 crc kubenswrapper[4760]: I1204 12:21:04.490494 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" event={"ID":"c66435e7-b564-4895-b5a7-a40ec6bffa21","Type":"ContainerStarted","Data":"c19ec215ff6de5ba2a26295f8d169c904d40bab84e95f82f5bdc22a762b45492"} Dec 04 12:21:05 crc kubenswrapper[4760]: I1204 12:21:05.597668 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" event={"ID":"c66435e7-b564-4895-b5a7-a40ec6bffa21","Type":"ContainerStarted","Data":"9202b38825152814c6e69cfcb1a5e5ba2efa312e3c06fd6971f10936ec0edf66"} Dec 04 12:21:05 crc kubenswrapper[4760]: I1204 12:21:05.598395 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 
12:21:09.874969 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" podStartSLOduration=7.87494633 podStartE2EDuration="7.87494633s" podCreationTimestamp="2025-12-04 12:21:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:21:05.630017234 +0000 UTC m=+468.671463801" watchObservedRunningTime="2025-12-04 12:21:09.87494633 +0000 UTC m=+472.916392897" Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.880640 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bptn9"] Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.880932 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bptn9" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerName="registry-server" containerID="cri-o://119235e10def2886fe1fde6c3c4a419a48a4ba6457aa9becb2104eced2eedacc" gracePeriod=30 Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.894968 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l9t6h"] Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.913552 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94pdb"] Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.913847 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-94pdb" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" containerName="registry-server" containerID="cri-o://770340b81b8a1d50e739eaa57e985bd69f7933784df631cb0d8858e5dc4af17e" gracePeriod=30 Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.942021 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rpnt8"] Dec 04 12:21:09 crc 
kubenswrapper[4760]: I1204 12:21:09.942752 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rpnt8" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerName="registry-server" containerID="cri-o://b7f999cf403f1b687eedc308b2a99c22bea2470a7b7979a8c448192ba607bcea" gracePeriod=30 Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.947385 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvdzl"] Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.947695 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" podUID="2262a901-d392-434b-bd32-43555b67f428" containerName="marketplace-operator" containerID="cri-o://576ce8d59675149466424ee55ad9092e3f14024e00aba17735a11ccc3cfcb3f8" gracePeriod=30 Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.962456 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g48rj"] Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.963227 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g48rj" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" containerName="registry-server" containerID="cri-o://ce282d39b5518acbb22e2eb04d59aaae0755a087e85d28c3da1445d16f7c6927" gracePeriod=30 Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.976303 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lldkh"] Dec 04 12:21:09 crc kubenswrapper[4760]: I1204 12:21:09.976718 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lldkh" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerName="registry-server" containerID="cri-o://99eaf56a85ea61a2e74d07b1d509e294eacc24a2a6765c773d534ec74aa0a9f9" 
gracePeriod=30 Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.499780 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xj92j"] Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.519778 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.522717 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pt6wf"] Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.522864 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ql8tz"] Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.522895 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xj92j"] Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.523448 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ql8tz" podUID="78261f38-564b-4487-a29e-5edc6859825e" containerName="registry-server" containerID="cri-o://4786b09a38b4f96970bb6e7f9f4a6e1d6d38032bff2f27c387bde3b30ec92ac0" gracePeriod=30 Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.523948 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pt6wf" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerName="registry-server" containerID="cri-o://f10d201a5a66cefb4da0e3b8576648f2f82b084861d1cb345ef6f4c4cd1d3ac1" gracePeriod=30 Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.595972 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df130580-94d3-40cd-a840-c85281e78fcc-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-xj92j\" (UID: \"df130580-94d3-40cd-a840-c85281e78fcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.596095 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df130580-94d3-40cd-a840-c85281e78fcc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xj92j\" (UID: \"df130580-94d3-40cd-a840-c85281e78fcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.596168 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzt9k\" (UniqueName: \"kubernetes.io/projected/df130580-94d3-40cd-a840-c85281e78fcc-kube-api-access-kzt9k\") pod \"marketplace-operator-79b997595-xj92j\" (UID: \"df130580-94d3-40cd-a840-c85281e78fcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.674679 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpnt8" event={"ID":"6d9a363c-3a21-44a8-aeb0-720692d8ee7f","Type":"ContainerDied","Data":"b7f999cf403f1b687eedc308b2a99c22bea2470a7b7979a8c448192ba607bcea"} Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.674621 4760 generic.go:334] "Generic (PLEG): container finished" podID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerID="b7f999cf403f1b687eedc308b2a99c22bea2470a7b7979a8c448192ba607bcea" exitCode=0 Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.699873 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzt9k\" (UniqueName: \"kubernetes.io/projected/df130580-94d3-40cd-a840-c85281e78fcc-kube-api-access-kzt9k\") pod \"marketplace-operator-79b997595-xj92j\" (UID: \"df130580-94d3-40cd-a840-c85281e78fcc\") 
" pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.700183 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df130580-94d3-40cd-a840-c85281e78fcc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xj92j\" (UID: \"df130580-94d3-40cd-a840-c85281e78fcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.700316 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df130580-94d3-40cd-a840-c85281e78fcc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xj92j\" (UID: \"df130580-94d3-40cd-a840-c85281e78fcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.702173 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df130580-94d3-40cd-a840-c85281e78fcc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xj92j\" (UID: \"df130580-94d3-40cd-a840-c85281e78fcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.704823 4760 generic.go:334] "Generic (PLEG): container finished" podID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerID="119235e10def2886fe1fde6c3c4a419a48a4ba6457aa9becb2104eced2eedacc" exitCode=0 Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.706111 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bptn9" event={"ID":"a3906714-b46d-4640-be9f-d57ba0fd27bb","Type":"ContainerDied","Data":"119235e10def2886fe1fde6c3c4a419a48a4ba6457aa9becb2104eced2eedacc"} Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 
12:21:10.710688 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df130580-94d3-40cd-a840-c85281e78fcc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xj92j\" (UID: \"df130580-94d3-40cd-a840-c85281e78fcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.722065 4760 generic.go:334] "Generic (PLEG): container finished" podID="6ea41f2f-f148-4280-b046-1bea756a117a" containerID="770340b81b8a1d50e739eaa57e985bd69f7933784df631cb0d8858e5dc4af17e" exitCode=0 Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.722132 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pdb" event={"ID":"6ea41f2f-f148-4280-b046-1bea756a117a","Type":"ContainerDied","Data":"770340b81b8a1d50e739eaa57e985bd69f7933784df631cb0d8858e5dc4af17e"} Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.724721 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzt9k\" (UniqueName: \"kubernetes.io/projected/df130580-94d3-40cd-a840-c85281e78fcc-kube-api-access-kzt9k\") pod \"marketplace-operator-79b997595-xj92j\" (UID: \"df130580-94d3-40cd-a840-c85281e78fcc\") " pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.728725 4760 generic.go:334] "Generic (PLEG): container finished" podID="2262a901-d392-434b-bd32-43555b67f428" containerID="576ce8d59675149466424ee55ad9092e3f14024e00aba17735a11ccc3cfcb3f8" exitCode=0 Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.729149 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" event={"ID":"2262a901-d392-434b-bd32-43555b67f428","Type":"ContainerDied","Data":"576ce8d59675149466424ee55ad9092e3f14024e00aba17735a11ccc3cfcb3f8"} Dec 04 12:21:10 crc 
kubenswrapper[4760]: I1204 12:21:10.729192 4760 scope.go:117] "RemoveContainer" containerID="d1417a77a6420bb4709d113547dfa6fd8bcfa406b64291604746a1c6221f68c2" Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.758439 4760 generic.go:334] "Generic (PLEG): container finished" podID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerID="99eaf56a85ea61a2e74d07b1d509e294eacc24a2a6765c773d534ec74aa0a9f9" exitCode=0 Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.758554 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lldkh" event={"ID":"5a77f5f7-b738-4cba-94ca-06643a4ad964","Type":"ContainerDied","Data":"99eaf56a85ea61a2e74d07b1d509e294eacc24a2a6765c773d534ec74aa0a9f9"} Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.769081 4760 generic.go:334] "Generic (PLEG): container finished" podID="2a7dec90-d501-40e2-9338-df345c0fd672" containerID="ce282d39b5518acbb22e2eb04d59aaae0755a087e85d28c3da1445d16f7c6927" exitCode=0 Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.769333 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l9t6h" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerName="registry-server" containerID="cri-o://39f80c3ebd54a9e7aa5bb05c5021175eded1ad09a82b361e7b5d5e123f92c125" gracePeriod=30 Dec 04 12:21:10 crc kubenswrapper[4760]: I1204 12:21:10.769657 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g48rj" event={"ID":"2a7dec90-d501-40e2-9338-df345c0fd672","Type":"ContainerDied","Data":"ce282d39b5518acbb22e2eb04d59aaae0755a087e85d28c3da1445d16f7c6927"} Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.367088 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.377418 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.408798 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g48rj" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.452836 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bptn9" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.539629 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-utilities\") pod \"6ea41f2f-f148-4280-b046-1bea756a117a\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.540029 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9pv2\" (UniqueName: \"kubernetes.io/projected/6ea41f2f-f148-4280-b046-1bea756a117a-kube-api-access-k9pv2\") pod \"6ea41f2f-f148-4280-b046-1bea756a117a\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.540584 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-catalog-content\") pod \"2a7dec90-d501-40e2-9338-df345c0fd672\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.540614 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-utilities\") pod \"2a7dec90-d501-40e2-9338-df345c0fd672\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.540695 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8rnm\" (UniqueName: \"kubernetes.io/projected/2a7dec90-d501-40e2-9338-df345c0fd672-kube-api-access-m8rnm\") pod \"2a7dec90-d501-40e2-9338-df345c0fd672\" (UID: \"2a7dec90-d501-40e2-9338-df345c0fd672\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.540730 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-catalog-content\") pod \"6ea41f2f-f148-4280-b046-1bea756a117a\" (UID: \"6ea41f2f-f148-4280-b046-1bea756a117a\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.541861 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-utilities" (OuterVolumeSpecName: "utilities") pod "6ea41f2f-f148-4280-b046-1bea756a117a" (UID: "6ea41f2f-f148-4280-b046-1bea756a117a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.542556 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-utilities" (OuterVolumeSpecName: "utilities") pod "2a7dec90-d501-40e2-9338-df345c0fd672" (UID: "2a7dec90-d501-40e2-9338-df345c0fd672"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.557688 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7dec90-d501-40e2-9338-df345c0fd672-kube-api-access-m8rnm" (OuterVolumeSpecName: "kube-api-access-m8rnm") pod "2a7dec90-d501-40e2-9338-df345c0fd672" (UID: "2a7dec90-d501-40e2-9338-df345c0fd672"). InnerVolumeSpecName "kube-api-access-m8rnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.566425 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a7dec90-d501-40e2-9338-df345c0fd672" (UID: "2a7dec90-d501-40e2-9338-df345c0fd672"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.575440 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea41f2f-f148-4280-b046-1bea756a117a-kube-api-access-k9pv2" (OuterVolumeSpecName: "kube-api-access-k9pv2") pod "6ea41f2f-f148-4280-b046-1bea756a117a" (UID: "6ea41f2f-f148-4280-b046-1bea756a117a"). InnerVolumeSpecName "kube-api-access-k9pv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.631262 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ea41f2f-f148-4280-b046-1bea756a117a" (UID: "6ea41f2f-f148-4280-b046-1bea756a117a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.642376 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-catalog-content\") pod \"a3906714-b46d-4640-be9f-d57ba0fd27bb\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.642423 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz7mw\" (UniqueName: \"kubernetes.io/projected/a3906714-b46d-4640-be9f-d57ba0fd27bb-kube-api-access-tz7mw\") pod \"a3906714-b46d-4640-be9f-d57ba0fd27bb\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.642530 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-utilities\") pod \"a3906714-b46d-4640-be9f-d57ba0fd27bb\" (UID: \"a3906714-b46d-4640-be9f-d57ba0fd27bb\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.642804 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.642821 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9pv2\" (UniqueName: \"kubernetes.io/projected/6ea41f2f-f148-4280-b046-1bea756a117a-kube-api-access-k9pv2\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.642837 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.642849 
4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a7dec90-d501-40e2-9338-df345c0fd672-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.642861 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8rnm\" (UniqueName: \"kubernetes.io/projected/2a7dec90-d501-40e2-9338-df345c0fd672-kube-api-access-m8rnm\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.642872 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ea41f2f-f148-4280-b046-1bea756a117a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.643625 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-utilities" (OuterVolumeSpecName: "utilities") pod "a3906714-b46d-4640-be9f-d57ba0fd27bb" (UID: "a3906714-b46d-4640-be9f-d57ba0fd27bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.649232 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3906714-b46d-4640-be9f-d57ba0fd27bb-kube-api-access-tz7mw" (OuterVolumeSpecName: "kube-api-access-tz7mw") pod "a3906714-b46d-4640-be9f-d57ba0fd27bb" (UID: "a3906714-b46d-4640-be9f-d57ba0fd27bb"). InnerVolumeSpecName "kube-api-access-tz7mw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.747821 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.747906 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz7mw\" (UniqueName: \"kubernetes.io/projected/a3906714-b46d-4640-be9f-d57ba0fd27bb-kube-api-access-tz7mw\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.815086 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.851947 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.852999 4760 generic.go:334] "Generic (PLEG): container finished" podID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerID="39f80c3ebd54a9e7aa5bb05c5021175eded1ad09a82b361e7b5d5e123f92c125" exitCode=0 Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.853072 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9t6h" event={"ID":"55ebb09b-1c59-4289-92f0-847b3c655fa9","Type":"ContainerDied","Data":"39f80c3ebd54a9e7aa5bb05c5021175eded1ad09a82b361e7b5d5e123f92c125"} Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.853762 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.857192 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g48rj" event={"ID":"2a7dec90-d501-40e2-9338-df345c0fd672","Type":"ContainerDied","Data":"fa2d0e952155b79cccb3c17739adc5a4aae8c4031ba7b49fe40da7408bbe7234"} Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.857286 4760 scope.go:117] "RemoveContainer" containerID="ce282d39b5518acbb22e2eb04d59aaae0755a087e85d28c3da1445d16f7c6927" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.857332 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g48rj" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.914534 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bptn9" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.924516 4760 generic.go:334] "Generic (PLEG): container finished" podID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerID="f10d201a5a66cefb4da0e3b8576648f2f82b084861d1cb345ef6f4c4cd1d3ac1" exitCode=0 Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.949292 4760 scope.go:117] "RemoveContainer" containerID="4beb0e3a73ad8f1fbbd17dc1589d6d7800cc9dc0a01e092755f743661b1b373a" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.949521 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bptn9" event={"ID":"a3906714-b46d-4640-be9f-d57ba0fd27bb","Type":"ContainerDied","Data":"57b56145cf76e6e35fca32e8addf4fec11b79d992cacfb8981b2a168e9937709"} Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.949561 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt6wf" 
event={"ID":"8ba63da0-8512-4c36-a755-beaa01a7007b","Type":"ContainerDied","Data":"f10d201a5a66cefb4da0e3b8576648f2f82b084861d1cb345ef6f4c4cd1d3ac1"} Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.950291 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ml6t\" (UniqueName: \"kubernetes.io/projected/2262a901-d392-434b-bd32-43555b67f428-kube-api-access-6ml6t\") pod \"2262a901-d392-434b-bd32-43555b67f428\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.950360 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2262a901-d392-434b-bd32-43555b67f428-marketplace-trusted-ca\") pod \"2262a901-d392-434b-bd32-43555b67f428\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.950430 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2262a901-d392-434b-bd32-43555b67f428-marketplace-operator-metrics\") pod \"2262a901-d392-434b-bd32-43555b67f428\" (UID: \"2262a901-d392-434b-bd32-43555b67f428\") " Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.952054 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94pdb" event={"ID":"6ea41f2f-f148-4280-b046-1bea756a117a","Type":"ContainerDied","Data":"386af51fa8df4a09f189354314363334676aad7bb7490e2fc452010cea258f8a"} Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.952245 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-94pdb" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.954914 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2262a901-d392-434b-bd32-43555b67f428-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2262a901-d392-434b-bd32-43555b67f428" (UID: "2262a901-d392-434b-bd32-43555b67f428"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.959364 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2262a901-d392-434b-bd32-43555b67f428-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2262a901-d392-434b-bd32-43555b67f428" (UID: "2262a901-d392-434b-bd32-43555b67f428"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.961444 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2262a901-d392-434b-bd32-43555b67f428-kube-api-access-6ml6t" (OuterVolumeSpecName: "kube-api-access-6ml6t") pod "2262a901-d392-434b-bd32-43555b67f428" (UID: "2262a901-d392-434b-bd32-43555b67f428"). InnerVolumeSpecName "kube-api-access-6ml6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.971056 4760 generic.go:334] "Generic (PLEG): container finished" podID="78261f38-564b-4487-a29e-5edc6859825e" containerID="4786b09a38b4f96970bb6e7f9f4a6e1d6d38032bff2f27c387bde3b30ec92ac0" exitCode=0 Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.971265 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ql8tz" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.971621 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql8tz" event={"ID":"78261f38-564b-4487-a29e-5edc6859825e","Type":"ContainerDied","Data":"4786b09a38b4f96970bb6e7f9f4a6e1d6d38032bff2f27c387bde3b30ec92ac0"} Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.982692 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g48rj"] Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.983561 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.986765 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" event={"ID":"2262a901-d392-434b-bd32-43555b67f428","Type":"ContainerDied","Data":"04c3536daae0c404bb35f62d677eba808a56118428e89deb7d9d86062b6a41aa"} Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.986874 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qvdzl" Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.991190 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g48rj"] Dec 04 12:21:11 crc kubenswrapper[4760]: I1204 12:21:11.998134 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94pdb"] Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.003957 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l9t6h" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.004110 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-94pdb"] Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.018742 4760 scope.go:117] "RemoveContainer" containerID="2a59bd0c2604ac0429f134badaeaf8846f1835c01facdff11983413229c7dc61" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.023513 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3906714-b46d-4640-be9f-d57ba0fd27bb" (UID: "a3906714-b46d-4640-be9f-d57ba0fd27bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.044510 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rpnt8" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.052162 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-catalog-content\") pod \"5a77f5f7-b738-4cba-94ca-06643a4ad964\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.052258 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7kvz\" (UniqueName: \"kubernetes.io/projected/5a77f5f7-b738-4cba-94ca-06643a4ad964-kube-api-access-h7kvz\") pod \"5a77f5f7-b738-4cba-94ca-06643a4ad964\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.052293 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq7j4\" (UniqueName: 
\"kubernetes.io/projected/78261f38-564b-4487-a29e-5edc6859825e-kube-api-access-sq7j4\") pod \"78261f38-564b-4487-a29e-5edc6859825e\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.052360 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-catalog-content\") pod \"78261f38-564b-4487-a29e-5edc6859825e\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.052475 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-utilities\") pod \"5a77f5f7-b738-4cba-94ca-06643a4ad964\" (UID: \"5a77f5f7-b738-4cba-94ca-06643a4ad964\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.052530 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-utilities\") pod \"78261f38-564b-4487-a29e-5edc6859825e\" (UID: \"78261f38-564b-4487-a29e-5edc6859825e\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.053005 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ml6t\" (UniqueName: \"kubernetes.io/projected/2262a901-d392-434b-bd32-43555b67f428-kube-api-access-6ml6t\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.053025 4760 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2262a901-d392-434b-bd32-43555b67f428-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.053039 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a3906714-b46d-4640-be9f-d57ba0fd27bb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.053050 4760 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2262a901-d392-434b-bd32-43555b67f428-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.058290 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-utilities" (OuterVolumeSpecName: "utilities") pod "5a77f5f7-b738-4cba-94ca-06643a4ad964" (UID: "5a77f5f7-b738-4cba-94ca-06643a4ad964"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.059461 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-utilities" (OuterVolumeSpecName: "utilities") pod "78261f38-564b-4487-a29e-5edc6859825e" (UID: "78261f38-564b-4487-a29e-5edc6859825e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.064573 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a77f5f7-b738-4cba-94ca-06643a4ad964-kube-api-access-h7kvz" (OuterVolumeSpecName: "kube-api-access-h7kvz") pod "5a77f5f7-b738-4cba-94ca-06643a4ad964" (UID: "5a77f5f7-b738-4cba-94ca-06643a4ad964"). InnerVolumeSpecName "kube-api-access-h7kvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.064708 4760 scope.go:117] "RemoveContainer" containerID="119235e10def2886fe1fde6c3c4a419a48a4ba6457aa9becb2104eced2eedacc" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.066092 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78261f38-564b-4487-a29e-5edc6859825e-kube-api-access-sq7j4" (OuterVolumeSpecName: "kube-api-access-sq7j4") pod "78261f38-564b-4487-a29e-5edc6859825e" (UID: "78261f38-564b-4487-a29e-5edc6859825e"). InnerVolumeSpecName "kube-api-access-sq7j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.085252 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvdzl"] Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.091982 4760 scope.go:117] "RemoveContainer" containerID="df4492b64a044a2dee0084a14130bd5a4054f7e69e2bd70b5d62c2e90b565eb8" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.095445 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qvdzl"] Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.102912 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a77f5f7-b738-4cba-94ca-06643a4ad964" (UID: "5a77f5f7-b738-4cba-94ca-06643a4ad964"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.122285 4760 scope.go:117] "RemoveContainer" containerID="c5ef78da41e1d274ca0e9fe46d11797497f9cc1ce8f97d77b4b39e3f1af7d57a" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.148582 4760 scope.go:117] "RemoveContainer" containerID="770340b81b8a1d50e739eaa57e985bd69f7933784df631cb0d8858e5dc4af17e" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.153608 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-catalog-content\") pod \"8ba63da0-8512-4c36-a755-beaa01a7007b\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.153666 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-catalog-content\") pod \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.153784 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lhnj\" (UniqueName: \"kubernetes.io/projected/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-kube-api-access-8lhnj\") pod \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.153822 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-catalog-content\") pod \"55ebb09b-1c59-4289-92f0-847b3c655fa9\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.153844 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-utilities\") pod \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\" (UID: \"6d9a363c-3a21-44a8-aeb0-720692d8ee7f\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.153897 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-utilities\") pod \"55ebb09b-1c59-4289-92f0-847b3c655fa9\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.153969 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-utilities\") pod \"8ba63da0-8512-4c36-a755-beaa01a7007b\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.154050 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65zzq\" (UniqueName: \"kubernetes.io/projected/8ba63da0-8512-4c36-a755-beaa01a7007b-kube-api-access-65zzq\") pod \"8ba63da0-8512-4c36-a755-beaa01a7007b\" (UID: \"8ba63da0-8512-4c36-a755-beaa01a7007b\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.154082 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bhrg\" (UniqueName: \"kubernetes.io/projected/55ebb09b-1c59-4289-92f0-847b3c655fa9-kube-api-access-8bhrg\") pod \"55ebb09b-1c59-4289-92f0-847b3c655fa9\" (UID: \"55ebb09b-1c59-4289-92f0-847b3c655fa9\") " Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.154482 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.154510 4760 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-h7kvz\" (UniqueName: \"kubernetes.io/projected/5a77f5f7-b738-4cba-94ca-06643a4ad964-kube-api-access-h7kvz\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.154527 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq7j4\" (UniqueName: \"kubernetes.io/projected/78261f38-564b-4487-a29e-5edc6859825e-kube-api-access-sq7j4\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.154538 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a77f5f7-b738-4cba-94ca-06643a4ad964-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.154549 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.156957 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-utilities" (OuterVolumeSpecName: "utilities") pod "6d9a363c-3a21-44a8-aeb0-720692d8ee7f" (UID: "6d9a363c-3a21-44a8-aeb0-720692d8ee7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.156972 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-utilities" (OuterVolumeSpecName: "utilities") pod "55ebb09b-1c59-4289-92f0-847b3c655fa9" (UID: "55ebb09b-1c59-4289-92f0-847b3c655fa9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.158138 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-kube-api-access-8lhnj" (OuterVolumeSpecName: "kube-api-access-8lhnj") pod "6d9a363c-3a21-44a8-aeb0-720692d8ee7f" (UID: "6d9a363c-3a21-44a8-aeb0-720692d8ee7f"). InnerVolumeSpecName "kube-api-access-8lhnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.159946 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba63da0-8512-4c36-a755-beaa01a7007b-kube-api-access-65zzq" (OuterVolumeSpecName: "kube-api-access-65zzq") pod "8ba63da0-8512-4c36-a755-beaa01a7007b" (UID: "8ba63da0-8512-4c36-a755-beaa01a7007b"). InnerVolumeSpecName "kube-api-access-65zzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.160528 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-utilities" (OuterVolumeSpecName: "utilities") pod "8ba63da0-8512-4c36-a755-beaa01a7007b" (UID: "8ba63da0-8512-4c36-a755-beaa01a7007b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.162529 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ebb09b-1c59-4289-92f0-847b3c655fa9-kube-api-access-8bhrg" (OuterVolumeSpecName: "kube-api-access-8bhrg") pod "55ebb09b-1c59-4289-92f0-847b3c655fa9" (UID: "55ebb09b-1c59-4289-92f0-847b3c655fa9"). InnerVolumeSpecName "kube-api-access-8bhrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.166031 4760 scope.go:117] "RemoveContainer" containerID="ec6127f8fb88f77dc615526d2897a00457f626bf7cd8ddf707990c1f09207f0f" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.189191 4760 scope.go:117] "RemoveContainer" containerID="fbaff96097442a615946b8877549b35b4f44b18bac0cbcd80208e596bd05e52a" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.189247 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78261f38-564b-4487-a29e-5edc6859825e" (UID: "78261f38-564b-4487-a29e-5edc6859825e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.203474 4760 scope.go:117] "RemoveContainer" containerID="4786b09a38b4f96970bb6e7f9f4a6e1d6d38032bff2f27c387bde3b30ec92ac0" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.220545 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d9a363c-3a21-44a8-aeb0-720692d8ee7f" (UID: "6d9a363c-3a21-44a8-aeb0-720692d8ee7f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.222165 4760 scope.go:117] "RemoveContainer" containerID="19b991656209bb1d3082c04e42be7b8a67a6db8c0e9ec39c04a018804db3bfa9" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.222593 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55ebb09b-1c59-4289-92f0-847b3c655fa9" (UID: "55ebb09b-1c59-4289-92f0-847b3c655fa9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.257823 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65zzq\" (UniqueName: \"kubernetes.io/projected/8ba63da0-8512-4c36-a755-beaa01a7007b-kube-api-access-65zzq\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.257881 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bhrg\" (UniqueName: \"kubernetes.io/projected/55ebb09b-1c59-4289-92f0-847b3c655fa9-kube-api-access-8bhrg\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.257903 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.257915 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lhnj\" (UniqueName: \"kubernetes.io/projected/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-kube-api-access-8lhnj\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.257940 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.257954 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d9a363c-3a21-44a8-aeb0-720692d8ee7f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.257970 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ebb09b-1c59-4289-92f0-847b3c655fa9-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.257981 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.257993 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78261f38-564b-4487-a29e-5edc6859825e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.262450 4760 scope.go:117] "RemoveContainer" containerID="43709f0621c9ab89ea7b79b089e8a59039df9b4ca6f5627c5f24e46aebb4b0e8" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.265332 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bptn9"] Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.274597 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bptn9"] Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.281160 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xj92j"] Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.284548 4760 scope.go:117] "RemoveContainer" 
containerID="576ce8d59675149466424ee55ad9092e3f14024e00aba17735a11ccc3cfcb3f8" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.306047 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ql8tz"] Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.311063 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ql8tz"] Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.326783 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ba63da0-8512-4c36-a755-beaa01a7007b" (UID: "8ba63da0-8512-4c36-a755-beaa01a7007b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:12 crc kubenswrapper[4760]: I1204 12:21:12.359012 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba63da0-8512-4c36-a755-beaa01a7007b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.000004 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9t6h" event={"ID":"55ebb09b-1c59-4289-92f0-847b3c655fa9","Type":"ContainerDied","Data":"b86859b48d41b8ef014530a58e3298e33079c694d07a39ccc07d8df57a61a9d2"} Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.000076 4760 scope.go:117] "RemoveContainer" containerID="39f80c3ebd54a9e7aa5bb05c5021175eded1ad09a82b361e7b5d5e123f92c125" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.000189 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l9t6h" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.006201 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpnt8" event={"ID":"6d9a363c-3a21-44a8-aeb0-720692d8ee7f","Type":"ContainerDied","Data":"d0d456f6813fb7618cb2fa3e1d2f149698da3c2caf32c0843f45772b543d5786"} Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.006361 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rpnt8" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.008993 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" event={"ID":"df130580-94d3-40cd-a840-c85281e78fcc","Type":"ContainerStarted","Data":"1bd1b630d2e661223015adeeb6bb3efa37d75cc059af02232ae9b593e6a1437d"} Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.009043 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" event={"ID":"df130580-94d3-40cd-a840-c85281e78fcc","Type":"ContainerStarted","Data":"a517ebd19561c93219ddc69cc637eeed26ef153bd604e037112ac5811b68508e"} Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.010889 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.024093 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.028806 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lldkh" event={"ID":"5a77f5f7-b738-4cba-94ca-06643a4ad964","Type":"ContainerDied","Data":"df6265cf23803295c0706cfb160372eab67fcf95009963d2f24d81e7fbd62880"} Dec 04 
12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.028941 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lldkh" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.031840 4760 scope.go:117] "RemoveContainer" containerID="af43d776df0a39cf84907aad72f960bd96624ca3b91071e1448c373b108987c8" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.053060 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xj92j" podStartSLOduration=4.053035965 podStartE2EDuration="4.053035965s" podCreationTimestamp="2025-12-04 12:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:21:13.046076724 +0000 UTC m=+476.087523291" watchObservedRunningTime="2025-12-04 12:21:13.053035965 +0000 UTC m=+476.094482532" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.053885 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt6wf" event={"ID":"8ba63da0-8512-4c36-a755-beaa01a7007b","Type":"ContainerDied","Data":"213aec7054610f353d35bab5ab3f2799e54de8b4765e3cfaa1aef9fe4d2af680"} Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.054051 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pt6wf" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.078806 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rpnt8"] Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.082514 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rpnt8"] Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.087196 4760 scope.go:117] "RemoveContainer" containerID="3dd69ee3c13029103989b75db06b5be2d7000180ef8b8f9e76c8b6491ff7645b" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.097729 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l9t6h"] Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.103238 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l9t6h"] Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.111675 4760 scope.go:117] "RemoveContainer" containerID="b7f999cf403f1b687eedc308b2a99c22bea2470a7b7979a8c448192ba607bcea" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.149357 4760 scope.go:117] "RemoveContainer" containerID="693d2d713f8c1b4d0c2e6dae38ca865ee958799a3f86d0e8ed2fc1a8b6ddd341" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.152511 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pt6wf"] Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.155706 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pt6wf"] Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.172762 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lldkh"] Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.177431 4760 scope.go:117] "RemoveContainer" containerID="7c34bd6de2857c66edcb3c879077c02de303f2a48a4e08776b7b3fef34b60f93" Dec 
04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.182033 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lldkh"] Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.197440 4760 scope.go:117] "RemoveContainer" containerID="99eaf56a85ea61a2e74d07b1d509e294eacc24a2a6765c773d534ec74aa0a9f9" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.212584 4760 scope.go:117] "RemoveContainer" containerID="66ddaf89718a29f54023d398b58ca5628218dfded3deb61e3d36ac0ae15e41de" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.230986 4760 scope.go:117] "RemoveContainer" containerID="85d944f1c881f8197da5916e6ec72d5f739ef6c742ca85db3f039de267c54fa8" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.246848 4760 scope.go:117] "RemoveContainer" containerID="f10d201a5a66cefb4da0e3b8576648f2f82b084861d1cb345ef6f4c4cd1d3ac1" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.268898 4760 scope.go:117] "RemoveContainer" containerID="68efd119a41a62a8cbf1f91b611b6691bf1d6d892aaf96f066be8d3861b6eadc" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.293612 4760 scope.go:117] "RemoveContainer" containerID="639ba6555c8b98855087ce53fa165a12f748d0956794344638c4a4db0deb5094" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.874574 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2262a901-d392-434b-bd32-43555b67f428" path="/var/lib/kubelet/pods/2262a901-d392-434b-bd32-43555b67f428/volumes" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.875151 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" path="/var/lib/kubelet/pods/2a7dec90-d501-40e2-9338-df345c0fd672/volumes" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.875879 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" path="/var/lib/kubelet/pods/55ebb09b-1c59-4289-92f0-847b3c655fa9/volumes" Dec 04 
12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.877406 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" path="/var/lib/kubelet/pods/5a77f5f7-b738-4cba-94ca-06643a4ad964/volumes" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.878102 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" path="/var/lib/kubelet/pods/6d9a363c-3a21-44a8-aeb0-720692d8ee7f/volumes" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.879152 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" path="/var/lib/kubelet/pods/6ea41f2f-f148-4280-b046-1bea756a117a/volumes" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.879778 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78261f38-564b-4487-a29e-5edc6859825e" path="/var/lib/kubelet/pods/78261f38-564b-4487-a29e-5edc6859825e/volumes" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.880437 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" path="/var/lib/kubelet/pods/8ba63da0-8512-4c36-a755-beaa01a7007b/volumes" Dec 04 12:21:13 crc kubenswrapper[4760]: I1204 12:21:13.881856 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" path="/var/lib/kubelet/pods/a3906714-b46d-4640-be9f-d57ba0fd27bb/volumes" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.148240 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rkrwq"] Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149321 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149345 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149363 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149371 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149386 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149393 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149405 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149413 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149423 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149428 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149436 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78261f38-564b-4487-a29e-5edc6859825e" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149442 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="78261f38-564b-4487-a29e-5edc6859825e" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149454 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149460 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149468 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149474 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149481 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149489 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149499 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149505 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149513 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149519 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149531 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78261f38-564b-4487-a29e-5edc6859825e" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149537 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="78261f38-564b-4487-a29e-5edc6859825e" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149545 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149551 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149560 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149566 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149574 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2262a901-d392-434b-bd32-43555b67f428" containerName="marketplace-operator" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149580 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2262a901-d392-434b-bd32-43555b67f428" containerName="marketplace-operator" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149588 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78261f38-564b-4487-a29e-5edc6859825e" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149593 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="78261f38-564b-4487-a29e-5edc6859825e" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149602 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149607 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149615 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149620 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149629 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149635 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149641 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2262a901-d392-434b-bd32-43555b67f428" containerName="marketplace-operator" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149647 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2262a901-d392-434b-bd32-43555b67f428" containerName="marketplace-operator" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149656 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149661 4760 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149668 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149675 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerName="extract-content" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149683 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149688 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149695 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149700 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149708 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149715 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerName="extract-utilities" Dec 04 12:21:15 crc kubenswrapper[4760]: E1204 12:21:15.149725 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149731 4760 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149825 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba63da0-8512-4c36-a755-beaa01a7007b" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149837 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3906714-b46d-4640-be9f-d57ba0fd27bb" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149846 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea41f2f-f148-4280-b046-1bea756a117a" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149855 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9a363c-3a21-44a8-aeb0-720692d8ee7f" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149864 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2262a901-d392-434b-bd32-43555b67f428" containerName="marketplace-operator" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149871 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="78261f38-564b-4487-a29e-5edc6859825e" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149878 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a77f5f7-b738-4cba-94ca-06643a4ad964" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149887 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7dec90-d501-40e2-9338-df345c0fd672" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.149893 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ebb09b-1c59-4289-92f0-847b3c655fa9" containerName="registry-server" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.150050 4760 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2262a901-d392-434b-bd32-43555b67f428" containerName="marketplace-operator" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.150678 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.155679 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.164160 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rkrwq"] Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.309360 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40139a83-284c-4524-90c5-d20d77d6c286-utilities\") pod \"redhat-operators-rkrwq\" (UID: \"40139a83-284c-4524-90c5-d20d77d6c286\") " pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.309446 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p78j\" (UniqueName: \"kubernetes.io/projected/40139a83-284c-4524-90c5-d20d77d6c286-kube-api-access-2p78j\") pod \"redhat-operators-rkrwq\" (UID: \"40139a83-284c-4524-90c5-d20d77d6c286\") " pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.309514 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40139a83-284c-4524-90c5-d20d77d6c286-catalog-content\") pod \"redhat-operators-rkrwq\" (UID: \"40139a83-284c-4524-90c5-d20d77d6c286\") " pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.410703 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p78j\" (UniqueName: \"kubernetes.io/projected/40139a83-284c-4524-90c5-d20d77d6c286-kube-api-access-2p78j\") pod \"redhat-operators-rkrwq\" (UID: \"40139a83-284c-4524-90c5-d20d77d6c286\") " pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.410812 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40139a83-284c-4524-90c5-d20d77d6c286-catalog-content\") pod \"redhat-operators-rkrwq\" (UID: \"40139a83-284c-4524-90c5-d20d77d6c286\") " pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.410879 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40139a83-284c-4524-90c5-d20d77d6c286-utilities\") pod \"redhat-operators-rkrwq\" (UID: \"40139a83-284c-4524-90c5-d20d77d6c286\") " pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.411451 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40139a83-284c-4524-90c5-d20d77d6c286-utilities\") pod \"redhat-operators-rkrwq\" (UID: \"40139a83-284c-4524-90c5-d20d77d6c286\") " pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.411722 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40139a83-284c-4524-90c5-d20d77d6c286-catalog-content\") pod \"redhat-operators-rkrwq\" (UID: \"40139a83-284c-4524-90c5-d20d77d6c286\") " pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.433742 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2p78j\" (UniqueName: \"kubernetes.io/projected/40139a83-284c-4524-90c5-d20d77d6c286-kube-api-access-2p78j\") pod \"redhat-operators-rkrwq\" (UID: \"40139a83-284c-4524-90c5-d20d77d6c286\") " pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.475029 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.748638 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qd697"] Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.751310 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.761526 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.768918 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qd697"] Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.817998 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzbmq\" (UniqueName: \"kubernetes.io/projected/cd76407f-83a3-4ef8-8f86-871ca466e436-kube-api-access-vzbmq\") pod \"community-operators-qd697\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.818128 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-catalog-content\") pod \"community-operators-qd697\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " 
pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.818169 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-utilities\") pod \"community-operators-qd697\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.898941 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rkrwq"] Dec 04 12:21:15 crc kubenswrapper[4760]: W1204 12:21:15.905884 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40139a83_284c_4524_90c5_d20d77d6c286.slice/crio-5059ca2c916ee8e7fc9d68a49cd1e9d5f84b03944644ab5da2138c6d2838ede5 WatchSource:0}: Error finding container 5059ca2c916ee8e7fc9d68a49cd1e9d5f84b03944644ab5da2138c6d2838ede5: Status 404 returned error can't find the container with id 5059ca2c916ee8e7fc9d68a49cd1e9d5f84b03944644ab5da2138c6d2838ede5 Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.919808 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-catalog-content\") pod \"community-operators-qd697\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.919869 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-utilities\") pod \"community-operators-qd697\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 
12:21:15.919940 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzbmq\" (UniqueName: \"kubernetes.io/projected/cd76407f-83a3-4ef8-8f86-871ca466e436-kube-api-access-vzbmq\") pod \"community-operators-qd697\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.920348 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-catalog-content\") pod \"community-operators-qd697\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.920406 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-utilities\") pod \"community-operators-qd697\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:15 crc kubenswrapper[4760]: I1204 12:21:15.951630 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzbmq\" (UniqueName: \"kubernetes.io/projected/cd76407f-83a3-4ef8-8f86-871ca466e436-kube-api-access-vzbmq\") pod \"community-operators-qd697\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:16 crc kubenswrapper[4760]: I1204 12:21:16.071395 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:16 crc kubenswrapper[4760]: I1204 12:21:16.093309 4760 generic.go:334] "Generic (PLEG): container finished" podID="40139a83-284c-4524-90c5-d20d77d6c286" containerID="b14c96d307c771e1a17d0ca5adeb86859bc1ed8edd171d5d15ce12be3aa3d6cd" exitCode=0 Dec 04 12:21:16 crc kubenswrapper[4760]: I1204 12:21:16.093365 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkrwq" event={"ID":"40139a83-284c-4524-90c5-d20d77d6c286","Type":"ContainerDied","Data":"b14c96d307c771e1a17d0ca5adeb86859bc1ed8edd171d5d15ce12be3aa3d6cd"} Dec 04 12:21:16 crc kubenswrapper[4760]: I1204 12:21:16.093397 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkrwq" event={"ID":"40139a83-284c-4524-90c5-d20d77d6c286","Type":"ContainerStarted","Data":"5059ca2c916ee8e7fc9d68a49cd1e9d5f84b03944644ab5da2138c6d2838ede5"} Dec 04 12:21:16 crc kubenswrapper[4760]: I1204 12:21:16.096832 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 12:21:16 crc kubenswrapper[4760]: I1204 12:21:16.492467 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qd697"] Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.106636 4760 generic.go:334] "Generic (PLEG): container finished" podID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerID="6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef" exitCode=0 Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.106869 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd697" event={"ID":"cd76407f-83a3-4ef8-8f86-871ca466e436","Type":"ContainerDied","Data":"6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef"} Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.107046 4760 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-qd697" event={"ID":"cd76407f-83a3-4ef8-8f86-871ca466e436","Type":"ContainerStarted","Data":"c0611821daf6026ee1261180748e68287432564f206a0c0d69decf38fb3447aa"} Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.111080 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkrwq" event={"ID":"40139a83-284c-4524-90c5-d20d77d6c286","Type":"ContainerStarted","Data":"b0e9ca68c5b4631251be3ce6c787435912947f9f3500ac4fbb7da0cb5c1293f0"} Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.551290 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h9cp7"] Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.552777 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.556883 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.571963 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h9cp7"] Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.654595 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43708b75-fa1d-4306-90e2-5d057baed057-catalog-content\") pod \"redhat-marketplace-h9cp7\" (UID: \"43708b75-fa1d-4306-90e2-5d057baed057\") " pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.654719 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbh8z\" (UniqueName: \"kubernetes.io/projected/43708b75-fa1d-4306-90e2-5d057baed057-kube-api-access-mbh8z\") pod \"redhat-marketplace-h9cp7\" 
(UID: \"43708b75-fa1d-4306-90e2-5d057baed057\") " pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.654740 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43708b75-fa1d-4306-90e2-5d057baed057-utilities\") pod \"redhat-marketplace-h9cp7\" (UID: \"43708b75-fa1d-4306-90e2-5d057baed057\") " pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.755726 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbh8z\" (UniqueName: \"kubernetes.io/projected/43708b75-fa1d-4306-90e2-5d057baed057-kube-api-access-mbh8z\") pod \"redhat-marketplace-h9cp7\" (UID: \"43708b75-fa1d-4306-90e2-5d057baed057\") " pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.755792 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43708b75-fa1d-4306-90e2-5d057baed057-utilities\") pod \"redhat-marketplace-h9cp7\" (UID: \"43708b75-fa1d-4306-90e2-5d057baed057\") " pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.755845 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43708b75-fa1d-4306-90e2-5d057baed057-catalog-content\") pod \"redhat-marketplace-h9cp7\" (UID: \"43708b75-fa1d-4306-90e2-5d057baed057\") " pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.756445 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43708b75-fa1d-4306-90e2-5d057baed057-catalog-content\") pod \"redhat-marketplace-h9cp7\" (UID: 
\"43708b75-fa1d-4306-90e2-5d057baed057\") " pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.757050 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43708b75-fa1d-4306-90e2-5d057baed057-utilities\") pod \"redhat-marketplace-h9cp7\" (UID: \"43708b75-fa1d-4306-90e2-5d057baed057\") " pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.781264 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbh8z\" (UniqueName: \"kubernetes.io/projected/43708b75-fa1d-4306-90e2-5d057baed057-kube-api-access-mbh8z\") pod \"redhat-marketplace-h9cp7\" (UID: \"43708b75-fa1d-4306-90e2-5d057baed057\") " pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.894777 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 12:21:17 crc kubenswrapper[4760]: I1204 12:21:17.902896 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.129372 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd697" event={"ID":"cd76407f-83a3-4ef8-8f86-871ca466e436","Type":"ContainerStarted","Data":"d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8"} Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.159109 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6p954"] Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.167451 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.172446 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.190194 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h9cp7"] Dec 04 12:21:18 crc kubenswrapper[4760]: W1204 12:21:18.198032 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43708b75_fa1d_4306_90e2_5d057baed057.slice/crio-ccdf6bf3970246568e9d90dbc6822cf33b891fa7f271e4bc7d10f3cf2be1db62 WatchSource:0}: Error finding container ccdf6bf3970246568e9d90dbc6822cf33b891fa7f271e4bc7d10f3cf2be1db62: Status 404 returned error can't find the container with id ccdf6bf3970246568e9d90dbc6822cf33b891fa7f271e4bc7d10f3cf2be1db62 Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.206231 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6p954"] Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.365694 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d72c0613-6684-4d59-9968-065130b7b861-catalog-content\") pod \"certified-operators-6p954\" (UID: \"d72c0613-6684-4d59-9968-065130b7b861\") " pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.366106 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85lz9\" (UniqueName: \"kubernetes.io/projected/d72c0613-6684-4d59-9968-065130b7b861-kube-api-access-85lz9\") pod \"certified-operators-6p954\" (UID: \"d72c0613-6684-4d59-9968-065130b7b861\") " pod="openshift-marketplace/certified-operators-6p954" 
Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.366615 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d72c0613-6684-4d59-9968-065130b7b861-utilities\") pod \"certified-operators-6p954\" (UID: \"d72c0613-6684-4d59-9968-065130b7b861\") " pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.468888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d72c0613-6684-4d59-9968-065130b7b861-catalog-content\") pod \"certified-operators-6p954\" (UID: \"d72c0613-6684-4d59-9968-065130b7b861\") " pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.469069 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85lz9\" (UniqueName: \"kubernetes.io/projected/d72c0613-6684-4d59-9968-065130b7b861-kube-api-access-85lz9\") pod \"certified-operators-6p954\" (UID: \"d72c0613-6684-4d59-9968-065130b7b861\") " pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.469230 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d72c0613-6684-4d59-9968-065130b7b861-utilities\") pod \"certified-operators-6p954\" (UID: \"d72c0613-6684-4d59-9968-065130b7b861\") " pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.469683 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d72c0613-6684-4d59-9968-065130b7b861-catalog-content\") pod \"certified-operators-6p954\" (UID: \"d72c0613-6684-4d59-9968-065130b7b861\") " pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:18 
crc kubenswrapper[4760]: I1204 12:21:18.469828 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d72c0613-6684-4d59-9968-065130b7b861-utilities\") pod \"certified-operators-6p954\" (UID: \"d72c0613-6684-4d59-9968-065130b7b861\") " pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.491738 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85lz9\" (UniqueName: \"kubernetes.io/projected/d72c0613-6684-4d59-9968-065130b7b861-kube-api-access-85lz9\") pod \"certified-operators-6p954\" (UID: \"d72c0613-6684-4d59-9968-065130b7b861\") " pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.520014 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:18 crc kubenswrapper[4760]: I1204 12:21:18.734559 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6p954"] Dec 04 12:21:18 crc kubenswrapper[4760]: W1204 12:21:18.745638 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd72c0613_6684_4d59_9968_065130b7b861.slice/crio-fdd8d1d92088e238c1580a8c2827d183171989dee18d7861a67f6f62866eb5c9 WatchSource:0}: Error finding container fdd8d1d92088e238c1580a8c2827d183171989dee18d7861a67f6f62866eb5c9: Status 404 returned error can't find the container with id fdd8d1d92088e238c1580a8c2827d183171989dee18d7861a67f6f62866eb5c9 Dec 04 12:21:19 crc kubenswrapper[4760]: I1204 12:21:19.135004 4760 generic.go:334] "Generic (PLEG): container finished" podID="40139a83-284c-4524-90c5-d20d77d6c286" containerID="b0e9ca68c5b4631251be3ce6c787435912947f9f3500ac4fbb7da0cb5c1293f0" exitCode=0 Dec 04 12:21:19 crc kubenswrapper[4760]: I1204 12:21:19.135092 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkrwq" event={"ID":"40139a83-284c-4524-90c5-d20d77d6c286","Type":"ContainerDied","Data":"b0e9ca68c5b4631251be3ce6c787435912947f9f3500ac4fbb7da0cb5c1293f0"} Dec 04 12:21:19 crc kubenswrapper[4760]: I1204 12:21:19.137609 4760 generic.go:334] "Generic (PLEG): container finished" podID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerID="d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8" exitCode=0 Dec 04 12:21:19 crc kubenswrapper[4760]: I1204 12:21:19.138391 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd697" event={"ID":"cd76407f-83a3-4ef8-8f86-871ca466e436","Type":"ContainerDied","Data":"d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8"} Dec 04 12:21:19 crc kubenswrapper[4760]: I1204 12:21:19.139512 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6p954" event={"ID":"d72c0613-6684-4d59-9968-065130b7b861","Type":"ContainerStarted","Data":"fdd8d1d92088e238c1580a8c2827d183171989dee18d7861a67f6f62866eb5c9"} Dec 04 12:21:19 crc kubenswrapper[4760]: I1204 12:21:19.142797 4760 generic.go:334] "Generic (PLEG): container finished" podID="43708b75-fa1d-4306-90e2-5d057baed057" containerID="1dd535b25136b02f62f05814d7cc9ce035753b6e27977cdc77748c9bea4546db" exitCode=0 Dec 04 12:21:19 crc kubenswrapper[4760]: I1204 12:21:19.142840 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h9cp7" event={"ID":"43708b75-fa1d-4306-90e2-5d057baed057","Type":"ContainerDied","Data":"1dd535b25136b02f62f05814d7cc9ce035753b6e27977cdc77748c9bea4546db"} Dec 04 12:21:19 crc kubenswrapper[4760]: I1204 12:21:19.142868 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h9cp7" 
event={"ID":"43708b75-fa1d-4306-90e2-5d057baed057","Type":"ContainerStarted","Data":"ccdf6bf3970246568e9d90dbc6822cf33b891fa7f271e4bc7d10f3cf2be1db62"} Dec 04 12:21:20 crc kubenswrapper[4760]: I1204 12:21:20.149260 4760 generic.go:334] "Generic (PLEG): container finished" podID="d72c0613-6684-4d59-9968-065130b7b861" containerID="f19e1875f093531063bfabab79e39f44c95193853bdd2e23a22ddac062f122ee" exitCode=0 Dec 04 12:21:20 crc kubenswrapper[4760]: I1204 12:21:20.149550 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6p954" event={"ID":"d72c0613-6684-4d59-9968-065130b7b861","Type":"ContainerDied","Data":"f19e1875f093531063bfabab79e39f44c95193853bdd2e23a22ddac062f122ee"} Dec 04 12:21:20 crc kubenswrapper[4760]: I1204 12:21:20.158843 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rkrwq" event={"ID":"40139a83-284c-4524-90c5-d20d77d6c286","Type":"ContainerStarted","Data":"42cff5bac195795d41c7432dc67ab371bfdd1d796394d4e0d988c053ea97b3dd"} Dec 04 12:21:20 crc kubenswrapper[4760]: I1204 12:21:20.164465 4760 generic.go:334] "Generic (PLEG): container finished" podID="43708b75-fa1d-4306-90e2-5d057baed057" containerID="85bfe52963f01f21d0de179e3fbebdd51c21b8f7308b2b197b2dadb4b0da5087" exitCode=0 Dec 04 12:21:20 crc kubenswrapper[4760]: I1204 12:21:20.164535 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h9cp7" event={"ID":"43708b75-fa1d-4306-90e2-5d057baed057","Type":"ContainerDied","Data":"85bfe52963f01f21d0de179e3fbebdd51c21b8f7308b2b197b2dadb4b0da5087"} Dec 04 12:21:20 crc kubenswrapper[4760]: I1204 12:21:20.167661 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd697" event={"ID":"cd76407f-83a3-4ef8-8f86-871ca466e436","Type":"ContainerStarted","Data":"d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c"} Dec 04 12:21:20 crc kubenswrapper[4760]: I1204 
12:21:20.194502 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qd697" podStartSLOduration=2.754620363 podStartE2EDuration="5.194479463s" podCreationTimestamp="2025-12-04 12:21:15 +0000 UTC" firstStartedPulling="2025-12-04 12:21:17.108010802 +0000 UTC m=+480.149457369" lastFinishedPulling="2025-12-04 12:21:19.547869902 +0000 UTC m=+482.589316469" observedRunningTime="2025-12-04 12:21:20.193824732 +0000 UTC m=+483.235271299" watchObservedRunningTime="2025-12-04 12:21:20.194479463 +0000 UTC m=+483.235926030" Dec 04 12:21:20 crc kubenswrapper[4760]: I1204 12:21:20.238854 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rkrwq" podStartSLOduration=1.719667114 podStartE2EDuration="5.238833383s" podCreationTimestamp="2025-12-04 12:21:15 +0000 UTC" firstStartedPulling="2025-12-04 12:21:16.095471936 +0000 UTC m=+479.136918503" lastFinishedPulling="2025-12-04 12:21:19.614638205 +0000 UTC m=+482.656084772" observedRunningTime="2025-12-04 12:21:20.237254053 +0000 UTC m=+483.278700630" watchObservedRunningTime="2025-12-04 12:21:20.238833383 +0000 UTC m=+483.280279950" Dec 04 12:21:21 crc kubenswrapper[4760]: I1204 12:21:21.180410 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h9cp7" event={"ID":"43708b75-fa1d-4306-90e2-5d057baed057","Type":"ContainerStarted","Data":"4c35e566f63062baf8c251cddb9b8134df391093aa2994a91a5cec02aa3b9225"} Dec 04 12:21:21 crc kubenswrapper[4760]: I1204 12:21:21.217872 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h9cp7" podStartSLOduration=2.54056613 podStartE2EDuration="4.217847533s" podCreationTimestamp="2025-12-04 12:21:17 +0000 UTC" firstStartedPulling="2025-12-04 12:21:19.146757598 +0000 UTC m=+482.188204165" lastFinishedPulling="2025-12-04 12:21:20.824039001 +0000 UTC 
m=+483.865485568" observedRunningTime="2025-12-04 12:21:21.215720215 +0000 UTC m=+484.257166802" watchObservedRunningTime="2025-12-04 12:21:21.217847533 +0000 UTC m=+484.259294100" Dec 04 12:21:23 crc kubenswrapper[4760]: I1204 12:21:23.149659 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xlqtm" Dec 04 12:21:23 crc kubenswrapper[4760]: I1204 12:21:23.204122 4760 generic.go:334] "Generic (PLEG): container finished" podID="d72c0613-6684-4d59-9968-065130b7b861" containerID="948bbb970f900ed618034d67bf6cc590e48a9c860ce54b6beb679738cb8020a8" exitCode=0 Dec 04 12:21:23 crc kubenswrapper[4760]: I1204 12:21:23.204179 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6p954" event={"ID":"d72c0613-6684-4d59-9968-065130b7b861","Type":"ContainerDied","Data":"948bbb970f900ed618034d67bf6cc590e48a9c860ce54b6beb679738cb8020a8"} Dec 04 12:21:23 crc kubenswrapper[4760]: I1204 12:21:23.230241 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cwwhx"] Dec 04 12:21:24 crc kubenswrapper[4760]: I1204 12:21:24.225991 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6p954" event={"ID":"d72c0613-6684-4d59-9968-065130b7b861","Type":"ContainerStarted","Data":"e6c3b64c0fba96aec1fda1317dd489c815b51c39ae33cad9a90938c647f9319e"} Dec 04 12:21:24 crc kubenswrapper[4760]: I1204 12:21:24.245355 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6p954" podStartSLOduration=2.682778869 podStartE2EDuration="6.245133271s" podCreationTimestamp="2025-12-04 12:21:18 +0000 UTC" firstStartedPulling="2025-12-04 12:21:20.150737552 +0000 UTC m=+483.192184119" lastFinishedPulling="2025-12-04 12:21:23.713091954 +0000 UTC m=+486.754538521" observedRunningTime="2025-12-04 12:21:24.244913494 +0000 UTC 
m=+487.286360061" watchObservedRunningTime="2025-12-04 12:21:24.245133271 +0000 UTC m=+487.286579828" Dec 04 12:21:25 crc kubenswrapper[4760]: I1204 12:21:25.475390 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:25 crc kubenswrapper[4760]: I1204 12:21:25.476745 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:26 crc kubenswrapper[4760]: I1204 12:21:26.072315 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:26 crc kubenswrapper[4760]: I1204 12:21:26.072388 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:26 crc kubenswrapper[4760]: I1204 12:21:26.118850 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:26 crc kubenswrapper[4760]: I1204 12:21:26.285687 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qd697" Dec 04 12:21:26 crc kubenswrapper[4760]: I1204 12:21:26.531530 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rkrwq" podUID="40139a83-284c-4524-90c5-d20d77d6c286" containerName="registry-server" probeResult="failure" output=< Dec 04 12:21:26 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Dec 04 12:21:26 crc kubenswrapper[4760]: > Dec 04 12:21:27 crc kubenswrapper[4760]: I1204 12:21:27.904141 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:27 crc kubenswrapper[4760]: I1204 12:21:27.904572 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:27 crc kubenswrapper[4760]: I1204 12:21:27.946699 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:28 crc kubenswrapper[4760]: I1204 12:21:28.291498 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h9cp7" Dec 04 12:21:28 crc kubenswrapper[4760]: I1204 12:21:28.521642 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:28 crc kubenswrapper[4760]: I1204 12:21:28.521722 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:28 crc kubenswrapper[4760]: I1204 12:21:28.568251 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:29 crc kubenswrapper[4760]: I1204 12:21:29.297139 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6p954" Dec 04 12:21:35 crc kubenswrapper[4760]: I1204 12:21:35.522980 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:35 crc kubenswrapper[4760]: I1204 12:21:35.648771 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rkrwq" Dec 04 12:21:48 crc kubenswrapper[4760]: I1204 12:21:48.269935 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" podUID="1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" containerName="registry" containerID="cri-o://8ca9cfa98038a06ceec07bb193045ae851539156112b8d8efa1bd65d7740cfa9" gracePeriod=30 Dec 04 12:21:49 crc kubenswrapper[4760]: I1204 
12:21:49.682312 4760 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-cwwhx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.16:5000/healthz\": dial tcp 10.217.0.16:5000: connect: connection refused" start-of-body= Dec 04 12:21:49 crc kubenswrapper[4760]: I1204 12:21:49.682385 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" podUID="1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.16:5000/healthz\": dial tcp 10.217.0.16:5000: connect: connection refused" Dec 04 12:21:51 crc kubenswrapper[4760]: I1204 12:21:51.433464 4760 generic.go:334] "Generic (PLEG): container finished" podID="1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" containerID="8ca9cfa98038a06ceec07bb193045ae851539156112b8d8efa1bd65d7740cfa9" exitCode=0 Dec 04 12:21:51 crc kubenswrapper[4760]: I1204 12:21:51.433549 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" event={"ID":"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b","Type":"ContainerDied","Data":"8ca9cfa98038a06ceec07bb193045ae851539156112b8d8efa1bd65d7740cfa9"} Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.351436 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.440396 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" event={"ID":"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b","Type":"ContainerDied","Data":"d404149401a41b920aeb26138ed81ab5d8828e1d1876b302e042ff1c1d62b329"} Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.440456 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cwwhx" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.440459 4760 scope.go:117] "RemoveContainer" containerID="8ca9cfa98038a06ceec07bb193045ae851539156112b8d8efa1bd65d7740cfa9" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.465246 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.465348 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-installation-pull-secrets\") pod \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.465399 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-certificates\") pod \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.465428 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-tls\") pod \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.465473 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmw5j\" (UniqueName: 
\"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-kube-api-access-jmw5j\") pod \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.465505 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-bound-sa-token\") pod \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.465538 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-ca-trust-extracted\") pod \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.465565 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-trusted-ca\") pod \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\" (UID: \"1081e99d-fbd4-4c6d-a1b5-19613da9ac2b\") " Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.466861 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.467024 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.471666 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.476152 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-kube-api-access-jmw5j" (OuterVolumeSpecName: "kube-api-access-jmw5j") pod "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b"). InnerVolumeSpecName "kube-api-access-jmw5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.476486 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.476653 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.480251 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.485955 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" (UID: "1081e99d-fbd4-4c6d-a1b5-19613da9ac2b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.567511 4760 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.567558 4760 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.567570 4760 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.567581 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmw5j\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-kube-api-access-jmw5j\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.567591 4760 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.567601 4760 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.567612 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:21:52 crc 
kubenswrapper[4760]: I1204 12:21:52.770312 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cwwhx"] Dec 04 12:21:52 crc kubenswrapper[4760]: I1204 12:21:52.774918 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cwwhx"] Dec 04 12:21:53 crc kubenswrapper[4760]: I1204 12:21:53.871907 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" path="/var/lib/kubelet/pods/1081e99d-fbd4-4c6d-a1b5-19613da9ac2b/volumes" Dec 04 12:22:33 crc kubenswrapper[4760]: I1204 12:22:33.380591 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:22:33 crc kubenswrapper[4760]: I1204 12:22:33.381170 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:23:03 crc kubenswrapper[4760]: I1204 12:23:03.380916 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:23:03 crc kubenswrapper[4760]: I1204 12:23:03.381519 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:23:33 crc kubenswrapper[4760]: I1204 12:23:33.380581 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:23:33 crc kubenswrapper[4760]: I1204 12:23:33.381354 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:23:33 crc kubenswrapper[4760]: I1204 12:23:33.381465 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:23:33 crc kubenswrapper[4760]: I1204 12:23:33.382514 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c74be0dd0598690df2185325325d6e89258dfec4d55d769746cb995ea68c9dd"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 12:23:33 crc kubenswrapper[4760]: I1204 12:23:33.382657 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://1c74be0dd0598690df2185325325d6e89258dfec4d55d769746cb995ea68c9dd" gracePeriod=600 Dec 04 12:23:34 crc kubenswrapper[4760]: I1204 12:23:34.306591 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="1c74be0dd0598690df2185325325d6e89258dfec4d55d769746cb995ea68c9dd" exitCode=0 Dec 04 12:23:34 crc kubenswrapper[4760]: I1204 12:23:34.306691 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"1c74be0dd0598690df2185325325d6e89258dfec4d55d769746cb995ea68c9dd"} Dec 04 12:23:34 crc kubenswrapper[4760]: I1204 12:23:34.306984 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"84ba279dadecee1653448131b89b76db4ce63ea0a7071f444225a6a7cbc815ba"} Dec 04 12:23:34 crc kubenswrapper[4760]: I1204 12:23:34.307014 4760 scope.go:117] "RemoveContainer" containerID="98eec381a8f810288ff27a9094a0c5f872e203c92b3c99fd590046c2ebbea2b9" Dec 04 12:23:41 crc kubenswrapper[4760]: I1204 12:23:41.353398 4760 patch_prober.go:28] interesting pod/route-controller-manager-dbb45576-7ktds container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 12:23:41 crc kubenswrapper[4760]: I1204 12:23:41.353418 4760 patch_prober.go:28] interesting pod/route-controller-manager-dbb45576-7ktds container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 12:23:41 crc kubenswrapper[4760]: I1204 12:23:41.354092 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" 
podUID="62ed054c-a431-405b-9e8f-7a5df685b871" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:23:41 crc kubenswrapper[4760]: I1204 12:23:41.354169 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-dbb45576-7ktds" podUID="62ed054c-a431-405b-9e8f-7a5df685b871" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:25:33 crc kubenswrapper[4760]: I1204 12:25:33.380839 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:25:33 crc kubenswrapper[4760]: I1204 12:25:33.381467 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:26:03 crc kubenswrapper[4760]: I1204 12:26:03.380414 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:26:03 crc kubenswrapper[4760]: I1204 12:26:03.381441 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" 
podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:26:17 crc kubenswrapper[4760]: I1204 12:26:17.014668 4760 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.356903 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q62nq"] Dec 04 12:26:29 crc kubenswrapper[4760]: E1204 12:26:29.357789 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" containerName="registry" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.357811 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" containerName="registry" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.357947 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="1081e99d-fbd4-4c6d-a1b5-19613da9ac2b" containerName="registry" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.358474 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-q62nq" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.361747 4760 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-cdd26" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.362598 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.362752 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.369307 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-z26nl"] Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.370058 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-z26nl" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.372567 4760 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bgfcv" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.387944 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q62nq"] Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.393230 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-z26nl"] Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.403616 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gs5vd"] Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.404488 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-gs5vd" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.408318 4760 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-cr2cs" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.420272 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gs5vd"] Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.490329 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnqdl\" (UniqueName: \"kubernetes.io/projected/e6ca39a1-4f59-4e58-85a9-eb60075647a8-kube-api-access-dnqdl\") pod \"cert-manager-cainjector-7f985d654d-q62nq\" (UID: \"e6ca39a1-4f59-4e58-85a9-eb60075647a8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q62nq" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.490410 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqcz\" (UniqueName: \"kubernetes.io/projected/7ec7a22b-07c7-4ea7-b80a-cb9003ef2fcc-kube-api-access-xlqcz\") pod \"cert-manager-5b446d88c5-z26nl\" (UID: \"7ec7a22b-07c7-4ea7-b80a-cb9003ef2fcc\") " pod="cert-manager/cert-manager-5b446d88c5-z26nl" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.592082 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnqdl\" (UniqueName: \"kubernetes.io/projected/e6ca39a1-4f59-4e58-85a9-eb60075647a8-kube-api-access-dnqdl\") pod \"cert-manager-cainjector-7f985d654d-q62nq\" (UID: \"e6ca39a1-4f59-4e58-85a9-eb60075647a8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q62nq" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.592151 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqcz\" (UniqueName: 
\"kubernetes.io/projected/7ec7a22b-07c7-4ea7-b80a-cb9003ef2fcc-kube-api-access-xlqcz\") pod \"cert-manager-5b446d88c5-z26nl\" (UID: \"7ec7a22b-07c7-4ea7-b80a-cb9003ef2fcc\") " pod="cert-manager/cert-manager-5b446d88c5-z26nl" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.592243 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqt6s\" (UniqueName: \"kubernetes.io/projected/6e691fa7-b524-4703-9ba2-9b5d2936deef-kube-api-access-zqt6s\") pod \"cert-manager-webhook-5655c58dd6-gs5vd\" (UID: \"6e691fa7-b524-4703-9ba2-9b5d2936deef\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gs5vd" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.613129 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqcz\" (UniqueName: \"kubernetes.io/projected/7ec7a22b-07c7-4ea7-b80a-cb9003ef2fcc-kube-api-access-xlqcz\") pod \"cert-manager-5b446d88c5-z26nl\" (UID: \"7ec7a22b-07c7-4ea7-b80a-cb9003ef2fcc\") " pod="cert-manager/cert-manager-5b446d88c5-z26nl" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.618276 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnqdl\" (UniqueName: \"kubernetes.io/projected/e6ca39a1-4f59-4e58-85a9-eb60075647a8-kube-api-access-dnqdl\") pod \"cert-manager-cainjector-7f985d654d-q62nq\" (UID: \"e6ca39a1-4f59-4e58-85a9-eb60075647a8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q62nq" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.683297 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-q62nq" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.693679 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqt6s\" (UniqueName: \"kubernetes.io/projected/6e691fa7-b524-4703-9ba2-9b5d2936deef-kube-api-access-zqt6s\") pod \"cert-manager-webhook-5655c58dd6-gs5vd\" (UID: \"6e691fa7-b524-4703-9ba2-9b5d2936deef\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gs5vd" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.696913 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-z26nl" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.719726 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqt6s\" (UniqueName: \"kubernetes.io/projected/6e691fa7-b524-4703-9ba2-9b5d2936deef-kube-api-access-zqt6s\") pod \"cert-manager-webhook-5655c58dd6-gs5vd\" (UID: \"6e691fa7-b524-4703-9ba2-9b5d2936deef\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gs5vd" Dec 04 12:26:29 crc kubenswrapper[4760]: I1204 12:26:29.722371 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-gs5vd" Dec 04 12:26:30 crc kubenswrapper[4760]: I1204 12:26:30.074573 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-z26nl"] Dec 04 12:26:30 crc kubenswrapper[4760]: I1204 12:26:30.083097 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 12:26:30 crc kubenswrapper[4760]: I1204 12:26:30.214523 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q62nq"] Dec 04 12:26:30 crc kubenswrapper[4760]: W1204 12:26:30.220445 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6ca39a1_4f59_4e58_85a9_eb60075647a8.slice/crio-64aadc4934d6335dde36b9673888f0f1adb4e5653e0fe0adc1c7cc277046fd07 WatchSource:0}: Error finding container 64aadc4934d6335dde36b9673888f0f1adb4e5653e0fe0adc1c7cc277046fd07: Status 404 returned error can't find the container with id 64aadc4934d6335dde36b9673888f0f1adb4e5653e0fe0adc1c7cc277046fd07 Dec 04 12:26:30 crc kubenswrapper[4760]: I1204 12:26:30.266496 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-q62nq" event={"ID":"e6ca39a1-4f59-4e58-85a9-eb60075647a8","Type":"ContainerStarted","Data":"64aadc4934d6335dde36b9673888f0f1adb4e5653e0fe0adc1c7cc277046fd07"} Dec 04 12:26:30 crc kubenswrapper[4760]: I1204 12:26:30.268352 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-z26nl" event={"ID":"7ec7a22b-07c7-4ea7-b80a-cb9003ef2fcc","Type":"ContainerStarted","Data":"fbdd8aefddf0dc71ca3489192e31410f02f52f8e1c15f51166139f1d2abb360d"} Dec 04 12:26:30 crc kubenswrapper[4760]: I1204 12:26:30.316626 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gs5vd"] Dec 04 12:26:30 crc kubenswrapper[4760]: 
W1204 12:26:30.319733 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e691fa7_b524_4703_9ba2_9b5d2936deef.slice/crio-03f8fe75640fb8b8b09fdc27494b06523029246fe5151f5596635d82a4129c6f WatchSource:0}: Error finding container 03f8fe75640fb8b8b09fdc27494b06523029246fe5151f5596635d82a4129c6f: Status 404 returned error can't find the container with id 03f8fe75640fb8b8b09fdc27494b06523029246fe5151f5596635d82a4129c6f Dec 04 12:26:31 crc kubenswrapper[4760]: I1204 12:26:31.277283 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-gs5vd" event={"ID":"6e691fa7-b524-4703-9ba2-9b5d2936deef","Type":"ContainerStarted","Data":"03f8fe75640fb8b8b09fdc27494b06523029246fe5151f5596635d82a4129c6f"} Dec 04 12:26:33 crc kubenswrapper[4760]: I1204 12:26:33.295430 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-q62nq" event={"ID":"e6ca39a1-4f59-4e58-85a9-eb60075647a8","Type":"ContainerStarted","Data":"ae21150aa6050b9eb012ed0d1a675430d1419bea6fac1d763acfd6cbc74ee28e"} Dec 04 12:26:33 crc kubenswrapper[4760]: I1204 12:26:33.297925 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-z26nl" event={"ID":"7ec7a22b-07c7-4ea7-b80a-cb9003ef2fcc","Type":"ContainerStarted","Data":"d352864647b7f0ba327a15e3ff6c3a45e79360b109649f72723c7785a1b3514d"} Dec 04 12:26:33 crc kubenswrapper[4760]: I1204 12:26:33.318324 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-q62nq" podStartSLOduration=1.768449646 podStartE2EDuration="4.318302165s" podCreationTimestamp="2025-12-04 12:26:29 +0000 UTC" firstStartedPulling="2025-12-04 12:26:30.223235704 +0000 UTC m=+793.264682271" lastFinishedPulling="2025-12-04 12:26:32.773088223 +0000 UTC m=+795.814534790" observedRunningTime="2025-12-04 
12:26:33.315173186 +0000 UTC m=+796.356619753" watchObservedRunningTime="2025-12-04 12:26:33.318302165 +0000 UTC m=+796.359748732" Dec 04 12:26:33 crc kubenswrapper[4760]: I1204 12:26:33.334015 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-z26nl" podStartSLOduration=1.641539747 podStartE2EDuration="4.333994574s" podCreationTimestamp="2025-12-04 12:26:29 +0000 UTC" firstStartedPulling="2025-12-04 12:26:30.082830195 +0000 UTC m=+793.124276762" lastFinishedPulling="2025-12-04 12:26:32.775285032 +0000 UTC m=+795.816731589" observedRunningTime="2025-12-04 12:26:33.332515777 +0000 UTC m=+796.373962354" watchObservedRunningTime="2025-12-04 12:26:33.333994574 +0000 UTC m=+796.375441141" Dec 04 12:26:33 crc kubenswrapper[4760]: I1204 12:26:33.381384 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:26:33 crc kubenswrapper[4760]: I1204 12:26:33.381490 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:26:33 crc kubenswrapper[4760]: I1204 12:26:33.381577 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:26:33 crc kubenswrapper[4760]: I1204 12:26:33.382721 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84ba279dadecee1653448131b89b76db4ce63ea0a7071f444225a6a7cbc815ba"} 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 12:26:33 crc kubenswrapper[4760]: I1204 12:26:33.382805 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://84ba279dadecee1653448131b89b76db4ce63ea0a7071f444225a6a7cbc815ba" gracePeriod=600 Dec 04 12:26:34 crc kubenswrapper[4760]: I1204 12:26:34.305761 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-gs5vd" event={"ID":"6e691fa7-b524-4703-9ba2-9b5d2936deef","Type":"ContainerStarted","Data":"6658d9c78c738404dac4d9b1453c5118085b8edd50fa1273317f9558a8c5759b"} Dec 04 12:26:34 crc kubenswrapper[4760]: I1204 12:26:34.306971 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-gs5vd" Dec 04 12:26:34 crc kubenswrapper[4760]: I1204 12:26:34.311069 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="84ba279dadecee1653448131b89b76db4ce63ea0a7071f444225a6a7cbc815ba" exitCode=0 Dec 04 12:26:34 crc kubenswrapper[4760]: I1204 12:26:34.311142 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"84ba279dadecee1653448131b89b76db4ce63ea0a7071f444225a6a7cbc815ba"} Dec 04 12:26:34 crc kubenswrapper[4760]: I1204 12:26:34.311825 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"9dbcb718be2a7f2596059e1c2783a32fa9aefcba6858c3d8e8320ae2bdc7181a"} 
Dec 04 12:26:34 crc kubenswrapper[4760]: I1204 12:26:34.311848 4760 scope.go:117] "RemoveContainer" containerID="1c74be0dd0598690df2185325325d6e89258dfec4d55d769746cb995ea68c9dd" Dec 04 12:26:34 crc kubenswrapper[4760]: I1204 12:26:34.325896 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-gs5vd" podStartSLOduration=1.6548844790000001 podStartE2EDuration="5.3258714s" podCreationTimestamp="2025-12-04 12:26:29 +0000 UTC" firstStartedPulling="2025-12-04 12:26:30.322044201 +0000 UTC m=+793.363490768" lastFinishedPulling="2025-12-04 12:26:33.993031122 +0000 UTC m=+797.034477689" observedRunningTime="2025-12-04 12:26:34.322326469 +0000 UTC m=+797.363773036" watchObservedRunningTime="2025-12-04 12:26:34.3258714 +0000 UTC m=+797.367317967" Dec 04 12:26:39 crc kubenswrapper[4760]: I1204 12:26:39.725508 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-gs5vd" Dec 04 12:26:39 crc kubenswrapper[4760]: I1204 12:26:39.803282 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q8b49"] Dec 04 12:26:39 crc kubenswrapper[4760]: I1204 12:26:39.803652 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovn-controller" containerID="cri-o://bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878" gracePeriod=30 Dec 04 12:26:39 crc kubenswrapper[4760]: I1204 12:26:39.803680 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="sbdb" containerID="cri-o://3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e" gracePeriod=30 Dec 04 12:26:39 crc kubenswrapper[4760]: I1204 12:26:39.803771 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="kube-rbac-proxy-node" containerID="cri-o://1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329" gracePeriod=30 Dec 04 12:26:39 crc kubenswrapper[4760]: I1204 12:26:39.803739 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac" gracePeriod=30 Dec 04 12:26:39 crc kubenswrapper[4760]: I1204 12:26:39.803791 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="nbdb" containerID="cri-o://9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007" gracePeriod=30 Dec 04 12:26:39 crc kubenswrapper[4760]: I1204 12:26:39.803812 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="northd" containerID="cri-o://550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19" gracePeriod=30 Dec 04 12:26:39 crc kubenswrapper[4760]: I1204 12:26:39.803845 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovn-acl-logging" containerID="cri-o://0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c" gracePeriod=30 Dec 04 12:26:39 crc kubenswrapper[4760]: I1204 12:26:39.850871 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" 
podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" containerID="cri-o://57ad450cca5fc659a67cf074a73ec88c6bc591926b1636f59908fba3d9f25a69" gracePeriod=30 Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.358342 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovnkube-controller/3.log" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.362167 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovn-acl-logging/0.log" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.363059 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovn-controller/0.log" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364242 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="57ad450cca5fc659a67cf074a73ec88c6bc591926b1636f59908fba3d9f25a69" exitCode=0 Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364392 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e" exitCode=0 Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364473 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007" exitCode=0 Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364545 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19" exitCode=0 Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364246 4760 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"57ad450cca5fc659a67cf074a73ec88c6bc591926b1636f59908fba3d9f25a69"} Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364615 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac" exitCode=0 Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364666 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e"} Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364687 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007"} Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364689 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329" exitCode=0 Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364702 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19"} Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364710 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c" exitCode=143 Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 
12:26:40.364717 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac"} Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364723 4760 generic.go:334] "Generic (PLEG): container finished" podID="69907424-ac0b-4430-b508-af165754104f" containerID="bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878" exitCode=143 Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364730 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329"} Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364743 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c"} Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364755 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878"} Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.364800 4760 scope.go:117] "RemoveContainer" containerID="6de3638c8fbd065f82d6722aae7cd2df32883d73ad13381178347a76fb99d8f0" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.368193 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dg5hd_017b9fc1-6db4-4786-81f1-6cb9b09c90a3/kube-multus/2.log" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.368830 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-dg5hd_017b9fc1-6db4-4786-81f1-6cb9b09c90a3/kube-multus/1.log" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.368887 4760 generic.go:334] "Generic (PLEG): container finished" podID="017b9fc1-6db4-4786-81f1-6cb9b09c90a3" containerID="22849b4c74cfeea314c9800b164c42a8941c66b08bb09b8eea11b3bdd74ec348" exitCode=2 Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.368945 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dg5hd" event={"ID":"017b9fc1-6db4-4786-81f1-6cb9b09c90a3","Type":"ContainerDied","Data":"22849b4c74cfeea314c9800b164c42a8941c66b08bb09b8eea11b3bdd74ec348"} Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.369860 4760 scope.go:117] "RemoveContainer" containerID="22849b4c74cfeea314c9800b164c42a8941c66b08bb09b8eea11b3bdd74ec348" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.401272 4760 scope.go:117] "RemoveContainer" containerID="d330dda0e3d1027f89b70d9065b1a4c83152c3a5031b612c310878b463f9b887" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.639933 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovn-acl-logging/0.log" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.641982 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovn-controller/0.log" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.642806 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.704630 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2xnb5"] Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.704873 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="kubecfg-setup" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.704887 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="kubecfg-setup" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.704895 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovn-acl-logging" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.704902 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovn-acl-logging" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.704914 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.704921 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.704932 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.704940 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.704953 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.704961 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.704973 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="kube-rbac-proxy-node" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.704982 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="kube-rbac-proxy-node" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.704995 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705003 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.705010 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovn-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705015 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovn-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.705026 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705032 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.705039 4760 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705047 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.705057 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="northd" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705086 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="northd" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.705093 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="nbdb" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705100 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="nbdb" Dec 04 12:26:40 crc kubenswrapper[4760]: E1204 12:26:40.705112 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="sbdb" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705118 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="sbdb" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705239 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705249 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705255 4760 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovn-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705264 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="nbdb" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705274 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705283 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="northd" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705290 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="sbdb" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705297 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705303 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovn-acl-logging" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705310 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="kube-rbac-proxy-node" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705318 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.705502 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="69907424-ac0b-4430-b508-af165754104f" containerName="ovnkube-controller" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.707171 4760 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.744986 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-var-lib-openvswitch\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745093 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-netd\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745111 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-systemd-units\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745142 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-ovn-kubernetes\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745173 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-log-socket\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745202 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-script-lib\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745261 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-kubelet\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745275 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-netns\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745271 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745312 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745381 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745430 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745454 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745478 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745497 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-env-overrides\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745528 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-log-socket" (OuterVolumeSpecName: "log-socket") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745537 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-slash\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745580 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-slash" (OuterVolumeSpecName: "host-slash") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). 
InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745587 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hvpf\" (UniqueName: \"kubernetes.io/projected/69907424-ac0b-4430-b508-af165754104f-kube-api-access-7hvpf\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745671 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-config\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745698 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-openvswitch\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745731 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-ovn\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745764 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-systemd\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745842 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69907424-ac0b-4430-b508-af165754104f-ovn-node-metrics-cert\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745919 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-bin\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745948 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-node-log\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.745965 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-etc-openvswitch\") pod \"69907424-ac0b-4430-b508-af165754104f\" (UID: \"69907424-ac0b-4430-b508-af165754104f\") " Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.746224 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.746891 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.746930 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.746955 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.746987 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747305 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747361 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747391 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-node-log" (OuterVolumeSpecName: "node-log") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747419 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747530 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747902 4760 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747933 4760 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747951 4760 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747968 4760 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747981 4760 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-slash\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.747994 4760 reconciler_common.go:293] "Volume detached for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748006 4760 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748018 4760 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748028 4760 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748041 4760 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748053 4760 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-node-log\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748065 4760 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748077 4760 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-cni-netd\") on node \"crc\" 
DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748087 4760 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748140 4760 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748155 4760 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-log-socket\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.748168 4760 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69907424-ac0b-4430-b508-af165754104f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.753818 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69907424-ac0b-4430-b508-af165754104f-kube-api-access-7hvpf" (OuterVolumeSpecName: "kube-api-access-7hvpf") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "kube-api-access-7hvpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.754943 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69907424-ac0b-4430-b508-af165754104f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.779465 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "69907424-ac0b-4430-b508-af165754104f" (UID: "69907424-ac0b-4430-b508-af165754104f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.849779 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-kubelet\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.849867 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-log-socket\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.849911 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850395 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-slash\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850449 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-run-systemd\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850523 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-run-netns\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850549 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-var-lib-openvswitch\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850603 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-ovn-node-metrics-cert\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850642 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-etc-openvswitch\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850677 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-ovnkube-config\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850764 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-node-log\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850798 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-systemd-units\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850833 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-ovnkube-script-lib\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850865 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850898 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-run-ovn\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850945 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f7f4\" (UniqueName: \"kubernetes.io/projected/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-kube-api-access-9f7f4\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.850979 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-run-openvswitch\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.851028 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-env-overrides\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.851064 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-cni-netd\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.851244 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-cni-bin\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.851333 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hvpf\" (UniqueName: \"kubernetes.io/projected/69907424-ac0b-4430-b508-af165754104f-kube-api-access-7hvpf\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.851351 4760 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69907424-ac0b-4430-b508-af165754104f-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.851367 4760 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69907424-ac0b-4430-b508-af165754104f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.952722 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-cni-bin\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.952823 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-kubelet\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.952849 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-log-socket\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.952895 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.952946 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-slash\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.952969 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-run-systemd\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.952996 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-run-netns\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.952996 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-kubelet\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953036 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-run-systemd\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953107 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-run-netns\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953051 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-var-lib-openvswitch\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.952993 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-log-socket\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953148 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-slash\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953231 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953204 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-cni-bin\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953241 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-var-lib-openvswitch\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953403 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-ovn-node-metrics-cert\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953487 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-etc-openvswitch\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953541 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-ovnkube-config\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953583 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-node-log\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953611 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-etc-openvswitch\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953616 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-systemd-units\") pod 
\"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953681 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-systemd-units\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953696 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-ovnkube-script-lib\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953746 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-run-ovn\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953771 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953812 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f7f4\" (UniqueName: \"kubernetes.io/projected/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-kube-api-access-9f7f4\") pod \"ovnkube-node-2xnb5\" (UID: 
\"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953850 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-run-openvswitch\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953906 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-cni-netd\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.953933 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-env-overrides\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.954257 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-run-ovn-kubernetes\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.954967 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-env-overrides\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.955080 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-ovnkube-config\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.955139 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-node-log\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.955171 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-run-openvswitch\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.955198 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-run-ovn\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.955259 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-host-cni-netd\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.955600 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-ovnkube-script-lib\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.956961 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-ovn-node-metrics-cert\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:40 crc kubenswrapper[4760]: I1204 12:26:40.977931 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f7f4\" (UniqueName: \"kubernetes.io/projected/f991a58a-a7fc-48e5-a1d1-f2389e59abe9-kube-api-access-9f7f4\") pod \"ovnkube-node-2xnb5\" (UID: \"f991a58a-a7fc-48e5-a1d1-f2389e59abe9\") " pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.022887 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.375761 4760 generic.go:334] "Generic (PLEG): container finished" podID="f991a58a-a7fc-48e5-a1d1-f2389e59abe9" containerID="b5992560a1cca0204996d3c09b87b245e610b23e246a12348b6347c447925c85" exitCode=0 Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.375857 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" event={"ID":"f991a58a-a7fc-48e5-a1d1-f2389e59abe9","Type":"ContainerDied","Data":"b5992560a1cca0204996d3c09b87b245e610b23e246a12348b6347c447925c85"} Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.375905 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" event={"ID":"f991a58a-a7fc-48e5-a1d1-f2389e59abe9","Type":"ContainerStarted","Data":"068821de311f0c31cd43094ad429cd13808e9b9873f261cdd912e3ec2fef2ff8"} Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.382524 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovn-acl-logging/0.log" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.383103 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-q8b49_69907424-ac0b-4430-b508-af165754104f/ovn-controller/0.log" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.385527 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" event={"ID":"69907424-ac0b-4430-b508-af165754104f","Type":"ContainerDied","Data":"9f748fec3a4fc962610a6231c9856a92f74d102c9946e071562b2bc03e3c1e8b"} Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.385615 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-q8b49" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.385595 4760 scope.go:117] "RemoveContainer" containerID="57ad450cca5fc659a67cf074a73ec88c6bc591926b1636f59908fba3d9f25a69" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.389656 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dg5hd_017b9fc1-6db4-4786-81f1-6cb9b09c90a3/kube-multus/2.log" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.389722 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dg5hd" event={"ID":"017b9fc1-6db4-4786-81f1-6cb9b09c90a3","Type":"ContainerStarted","Data":"96b2d314b5795ca8c215cfaa423083c932dc1d5d6b247c13440e7b53ec829c6a"} Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.426139 4760 scope.go:117] "RemoveContainer" containerID="3209a8b44cb0fa510a26e4c11787027c519e274462f83696effadc47c770589e" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.450074 4760 scope.go:117] "RemoveContainer" containerID="9156fd4c1bc4ece9c3c922aec3caad8fc4c101f67eaaf2eb0b88f71ac8c90007" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.478939 4760 scope.go:117] "RemoveContainer" containerID="550d0890d6cdc6b358e20152e02247d5feaf83e33ea1fde7c871b8e6d1e22c19" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.479724 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q8b49"] Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.485542 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-q8b49"] Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.494266 4760 scope.go:117] "RemoveContainer" containerID="7124b7eecd7ae7750b437bd72df1ca237cee9474c607172ae69de51e876205ac" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.510565 4760 scope.go:117] "RemoveContainer" 
containerID="1bbd35ddf50ca228383278b7ada05642c637e60e1dc74f51e26e815de3722329" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.532394 4760 scope.go:117] "RemoveContainer" containerID="0d46db5ef4bf5053b11f3dfaf44dc4a7ae9e139ec6fca34ee3b66d5de7228d2c" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.551150 4760 scope.go:117] "RemoveContainer" containerID="bcad0b3103bd95df94c6953069d2d6eda62cdaa1dc2749270744a96f328cc878" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.572552 4760 scope.go:117] "RemoveContainer" containerID="13c9fc247f2dd7cd5d19c7c9a49518fc9fc1512e4aa916986e86cd4f61465e83" Dec 04 12:26:41 crc kubenswrapper[4760]: I1204 12:26:41.912566 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69907424-ac0b-4430-b508-af165754104f" path="/var/lib/kubelet/pods/69907424-ac0b-4430-b508-af165754104f/volumes" Dec 04 12:26:42 crc kubenswrapper[4760]: I1204 12:26:42.398650 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" event={"ID":"f991a58a-a7fc-48e5-a1d1-f2389e59abe9","Type":"ContainerStarted","Data":"79f80decf93fefa05c906741d840c105a89ae96c0161adcbd51908d7d4eea532"} Dec 04 12:26:42 crc kubenswrapper[4760]: I1204 12:26:42.399017 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" event={"ID":"f991a58a-a7fc-48e5-a1d1-f2389e59abe9","Type":"ContainerStarted","Data":"e04bb88b73021abf5ad108b1c673fe3883ff445814705ffc8512bb397f986407"} Dec 04 12:26:42 crc kubenswrapper[4760]: I1204 12:26:42.399039 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" event={"ID":"f991a58a-a7fc-48e5-a1d1-f2389e59abe9","Type":"ContainerStarted","Data":"64903b5f7f07037eae65e802beaf23fb9aea78398dc0fb0953582f2ec2f1236f"} Dec 04 12:26:42 crc kubenswrapper[4760]: I1204 12:26:42.399054 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" 
event={"ID":"f991a58a-a7fc-48e5-a1d1-f2389e59abe9","Type":"ContainerStarted","Data":"b4579aa67f75cba2c1c1d52e80f2b1e527241e08d7f793d794bf7733b1f7e2d4"} Dec 04 12:26:42 crc kubenswrapper[4760]: I1204 12:26:42.399069 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" event={"ID":"f991a58a-a7fc-48e5-a1d1-f2389e59abe9","Type":"ContainerStarted","Data":"ff015d24c7af9058589076fdfa8a4088e05383dc668fa7ac73028df793c7e30a"} Dec 04 12:26:42 crc kubenswrapper[4760]: I1204 12:26:42.399082 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" event={"ID":"f991a58a-a7fc-48e5-a1d1-f2389e59abe9","Type":"ContainerStarted","Data":"200d00e79339dbc006d3b0cf3effcd76b4bc4b590dbd1c8695148db826d7dd93"} Dec 04 12:26:44 crc kubenswrapper[4760]: I1204 12:26:44.431203 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" event={"ID":"f991a58a-a7fc-48e5-a1d1-f2389e59abe9","Type":"ContainerStarted","Data":"e2ddfd1a37f262ba17148d7487c33c2783967c975a8430c877261f9ddf6c5097"} Dec 04 12:26:48 crc kubenswrapper[4760]: I1204 12:26:48.506711 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" event={"ID":"f991a58a-a7fc-48e5-a1d1-f2389e59abe9","Type":"ContainerStarted","Data":"f1de340fbf3c82e6ece77ae7a293539c22be46b7007681e6f6c9d8b7487ae6da"} Dec 04 12:26:48 crc kubenswrapper[4760]: I1204 12:26:48.508783 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:48 crc kubenswrapper[4760]: I1204 12:26:48.508811 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:48 crc kubenswrapper[4760]: I1204 12:26:48.508823 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:48 crc 
kubenswrapper[4760]: I1204 12:26:48.552695 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" podStartSLOduration=8.552672377 podStartE2EDuration="8.552672377s" podCreationTimestamp="2025-12-04 12:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:26:48.551021644 +0000 UTC m=+811.592468231" watchObservedRunningTime="2025-12-04 12:26:48.552672377 +0000 UTC m=+811.594118944" Dec 04 12:26:48 crc kubenswrapper[4760]: I1204 12:26:48.555337 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:26:48 crc kubenswrapper[4760]: I1204 12:26:48.565511 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:27:04 crc kubenswrapper[4760]: I1204 12:27:04.894704 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Dec 04 12:27:04 crc kubenswrapper[4760]: I1204 12:27:04.897025 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph" Dec 04 12:27:04 crc kubenswrapper[4760]: I1204 12:27:04.899926 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 04 12:27:04 crc kubenswrapper[4760]: I1204 12:27:04.899993 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 04 12:27:04 crc kubenswrapper[4760]: I1204 12:27:04.900029 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-9w4ks" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.060244 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/ffee0fcf-4f0c-4471-8b39-1762da661157-run\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.060666 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/ffee0fcf-4f0c-4471-8b39-1762da661157-log\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.060691 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7jrd\" (UniqueName: \"kubernetes.io/projected/ffee0fcf-4f0c-4471-8b39-1762da661157-kube-api-access-z7jrd\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.060727 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ffee0fcf-4f0c-4471-8b39-1762da661157-data\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 
12:27:05.162094 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/ffee0fcf-4f0c-4471-8b39-1762da661157-run\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.162567 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/ffee0fcf-4f0c-4471-8b39-1762da661157-log\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.162651 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7jrd\" (UniqueName: \"kubernetes.io/projected/ffee0fcf-4f0c-4471-8b39-1762da661157-kube-api-access-z7jrd\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.162740 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ffee0fcf-4f0c-4471-8b39-1762da661157-data\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.162936 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/ffee0fcf-4f0c-4471-8b39-1762da661157-run\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.163236 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/ffee0fcf-4f0c-4471-8b39-1762da661157-log\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.163447 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ffee0fcf-4f0c-4471-8b39-1762da661157-data\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.187433 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7jrd\" (UniqueName: \"kubernetes.io/projected/ffee0fcf-4f0c-4471-8b39-1762da661157-kube-api-access-z7jrd\") pod \"ceph\" (UID: \"ffee0fcf-4f0c-4471-8b39-1762da661157\") " pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.218089 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Dec 04 12:27:05 crc kubenswrapper[4760]: W1204 12:27:05.249977 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffee0fcf_4f0c_4471_8b39_1762da661157.slice/crio-d5d74733c8a1d088e1a50f874391e8719a44f966692fd58defb9ed192e0372b7 WatchSource:0}: Error finding container d5d74733c8a1d088e1a50f874391e8719a44f966692fd58defb9ed192e0372b7: Status 404 returned error can't find the container with id d5d74733c8a1d088e1a50f874391e8719a44f966692fd58defb9ed192e0372b7 Dec 04 12:27:05 crc kubenswrapper[4760]: I1204 12:27:05.602595 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"ffee0fcf-4f0c-4471-8b39-1762da661157","Type":"ContainerStarted","Data":"d5d74733c8a1d088e1a50f874391e8719a44f966692fd58defb9ed192e0372b7"} Dec 04 12:27:11 crc kubenswrapper[4760]: I1204 12:27:11.048994 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2xnb5" Dec 04 12:27:24 crc kubenswrapper[4760]: E1204 12:27:24.116690 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/ceph/demo:latest-squid" Dec 04 12:27:24 crc 
kubenswrapper[4760]: E1204 12:27:24.118392 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceph,Image:quay.io/ceph/demo:latest-squid,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:MON_IP,Value:192.168.126.11,ValueFrom:nil,},EnvVar{Name:CEPH_DAEMON,Value:demo,ValueFrom:nil,},EnvVar{Name:CEPH_PUBLIC_NETWORK,Value:0.0.0.0/0,ValueFrom:nil,},EnvVar{Name:DEMO_DAEMONS,Value:osd,mds,rgw,ValueFrom:nil,},EnvVar{Name:CEPH_DEMO_UID,Value:0,ValueFrom:nil,},EnvVar{Name:RGW_NAME,Value:ceph,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:data,ReadOnly:false,MountPath:/var/lib/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run,ReadOnly:false,MountPath:/run/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z7jrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceph_openstack(ffee0fcf-4f0c-4471-8b39-1762da661157): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:27:24 crc kubenswrapper[4760]: E1204 12:27:24.119720 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ErrImagePull: \"rpc error: code = Canceled desc = 
copying config: context canceled\"" pod="openstack/ceph" podUID="ffee0fcf-4f0c-4471-8b39-1762da661157" Dec 04 12:27:24 crc kubenswrapper[4760]: E1204 12:27:24.742043 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/ceph/demo:latest-squid\\\"\"" pod="openstack/ceph" podUID="ffee0fcf-4f0c-4471-8b39-1762da661157" Dec 04 12:27:40 crc kubenswrapper[4760]: I1204 12:27:40.835747 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"ffee0fcf-4f0c-4471-8b39-1762da661157","Type":"ContainerStarted","Data":"f7193f7d0f9d64b24f2d1411159a390e128a8fab5e2e5cce58a85d43899a5d5a"} Dec 04 12:27:40 crc kubenswrapper[4760]: I1204 12:27:40.860592 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=2.250511418 podStartE2EDuration="36.86057206s" podCreationTimestamp="2025-12-04 12:27:04 +0000 UTC" firstStartedPulling="2025-12-04 12:27:05.251636323 +0000 UTC m=+828.293082890" lastFinishedPulling="2025-12-04 12:27:39.861696965 +0000 UTC m=+862.903143532" observedRunningTime="2025-12-04 12:27:40.855557181 +0000 UTC m=+863.897003758" watchObservedRunningTime="2025-12-04 12:27:40.86057206 +0000 UTC m=+863.902018627" Dec 04 12:28:07 crc kubenswrapper[4760]: E1204 12:28:07.371162 4760 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.107:34682->38.102.83.107:35609: write tcp 38.102.83.107:34682->38.102.83.107:35609: write: broken pipe Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.053524 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5"] Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.055719 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.058652 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.061878 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5"] Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.140684 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.140751 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gshp\" (UniqueName: \"kubernetes.io/projected/1cc8460c-742b-4533-a26f-225de9c85310-kube-api-access-2gshp\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.140892 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: 
I1204 12:28:57.241884 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.241994 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gshp\" (UniqueName: \"kubernetes.io/projected/1cc8460c-742b-4533-a26f-225de9c85310-kube-api-access-2gshp\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.242042 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.242559 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.242664 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.260495 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gshp\" (UniqueName: \"kubernetes.io/projected/1cc8460c-742b-4533-a26f-225de9c85310-kube-api-access-2gshp\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.371933 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:28:57 crc kubenswrapper[4760]: I1204 12:28:57.625109 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5"] Dec 04 12:28:58 crc kubenswrapper[4760]: I1204 12:28:58.310613 4760 generic.go:334] "Generic (PLEG): container finished" podID="1cc8460c-742b-4533-a26f-225de9c85310" containerID="3131f0c9b46dc3aa72fb9537c99d48d50f3c4c0403dd66c98eac5d63166a50b1" exitCode=0 Dec 04 12:28:58 crc kubenswrapper[4760]: I1204 12:28:58.310659 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" event={"ID":"1cc8460c-742b-4533-a26f-225de9c85310","Type":"ContainerDied","Data":"3131f0c9b46dc3aa72fb9537c99d48d50f3c4c0403dd66c98eac5d63166a50b1"} Dec 04 12:28:58 crc kubenswrapper[4760]: I1204 12:28:58.310965 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" event={"ID":"1cc8460c-742b-4533-a26f-225de9c85310","Type":"ContainerStarted","Data":"6083ced5cb5b6aeed134ee0fda3a9581a97d668886c387147bd6ff8a6d31693b"} Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.209650 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-98kd7"] Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.210929 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.241384 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98kd7"] Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.269594 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grhv2\" (UniqueName: \"kubernetes.io/projected/6dbf3914-07b6-495a-a246-69913972a047-kube-api-access-grhv2\") pod \"redhat-operators-98kd7\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.269663 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-catalog-content\") pod \"redhat-operators-98kd7\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.269937 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-utilities\") pod \"redhat-operators-98kd7\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " 
pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.371409 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-utilities\") pod \"redhat-operators-98kd7\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.371526 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grhv2\" (UniqueName: \"kubernetes.io/projected/6dbf3914-07b6-495a-a246-69913972a047-kube-api-access-grhv2\") pod \"redhat-operators-98kd7\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.371556 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-catalog-content\") pod \"redhat-operators-98kd7\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.372100 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-catalog-content\") pod \"redhat-operators-98kd7\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.372777 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-utilities\") pod \"redhat-operators-98kd7\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc 
kubenswrapper[4760]: I1204 12:28:59.395598 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grhv2\" (UniqueName: \"kubernetes.io/projected/6dbf3914-07b6-495a-a246-69913972a047-kube-api-access-grhv2\") pod \"redhat-operators-98kd7\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.536904 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:28:59 crc kubenswrapper[4760]: I1204 12:28:59.758245 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98kd7"] Dec 04 12:28:59 crc kubenswrapper[4760]: W1204 12:28:59.764471 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dbf3914_07b6_495a_a246_69913972a047.slice/crio-f6afec51259b3c06ea288e4d12750af69759362529862574fa3e483ab496bed8 WatchSource:0}: Error finding container f6afec51259b3c06ea288e4d12750af69759362529862574fa3e483ab496bed8: Status 404 returned error can't find the container with id f6afec51259b3c06ea288e4d12750af69759362529862574fa3e483ab496bed8 Dec 04 12:29:00 crc kubenswrapper[4760]: I1204 12:29:00.323285 4760 generic.go:334] "Generic (PLEG): container finished" podID="6dbf3914-07b6-495a-a246-69913972a047" containerID="8ccfd9ebbfb997c75afc573f63929bcbcd1fbe122a5a4ecc68a869b28a25a6e9" exitCode=0 Dec 04 12:29:00 crc kubenswrapper[4760]: I1204 12:29:00.323337 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98kd7" event={"ID":"6dbf3914-07b6-495a-a246-69913972a047","Type":"ContainerDied","Data":"8ccfd9ebbfb997c75afc573f63929bcbcd1fbe122a5a4ecc68a869b28a25a6e9"} Dec 04 12:29:00 crc kubenswrapper[4760]: I1204 12:29:00.323694 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-98kd7" event={"ID":"6dbf3914-07b6-495a-a246-69913972a047","Type":"ContainerStarted","Data":"f6afec51259b3c06ea288e4d12750af69759362529862574fa3e483ab496bed8"} Dec 04 12:29:00 crc kubenswrapper[4760]: I1204 12:29:00.326195 4760 generic.go:334] "Generic (PLEG): container finished" podID="1cc8460c-742b-4533-a26f-225de9c85310" containerID="48df39b827acf89fc8084300e28ee74e2824b2b336122346b87bb154c15cdda8" exitCode=0 Dec 04 12:29:00 crc kubenswrapper[4760]: I1204 12:29:00.326273 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" event={"ID":"1cc8460c-742b-4533-a26f-225de9c85310","Type":"ContainerDied","Data":"48df39b827acf89fc8084300e28ee74e2824b2b336122346b87bb154c15cdda8"} Dec 04 12:29:01 crc kubenswrapper[4760]: I1204 12:29:01.335344 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98kd7" event={"ID":"6dbf3914-07b6-495a-a246-69913972a047","Type":"ContainerStarted","Data":"8782a5459e11da009c09fdff20a53c17b525af474b129811c72ebb8f603050f1"} Dec 04 12:29:01 crc kubenswrapper[4760]: I1204 12:29:01.337567 4760 generic.go:334] "Generic (PLEG): container finished" podID="1cc8460c-742b-4533-a26f-225de9c85310" containerID="2eb3d9b5b902e31c258af0f16b42c7b353de6c40ec988a9619bec5f8ce7cfe54" exitCode=0 Dec 04 12:29:01 crc kubenswrapper[4760]: I1204 12:29:01.337615 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" event={"ID":"1cc8460c-742b-4533-a26f-225de9c85310","Type":"ContainerDied","Data":"2eb3d9b5b902e31c258af0f16b42c7b353de6c40ec988a9619bec5f8ce7cfe54"} Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.346477 4760 generic.go:334] "Generic (PLEG): container finished" podID="6dbf3914-07b6-495a-a246-69913972a047" containerID="8782a5459e11da009c09fdff20a53c17b525af474b129811c72ebb8f603050f1" 
exitCode=0 Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.346535 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98kd7" event={"ID":"6dbf3914-07b6-495a-a246-69913972a047","Type":"ContainerDied","Data":"8782a5459e11da009c09fdff20a53c17b525af474b129811c72ebb8f603050f1"} Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.599173 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.718283 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gshp\" (UniqueName: \"kubernetes.io/projected/1cc8460c-742b-4533-a26f-225de9c85310-kube-api-access-2gshp\") pod \"1cc8460c-742b-4533-a26f-225de9c85310\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.718403 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-util\") pod \"1cc8460c-742b-4533-a26f-225de9c85310\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.718456 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-bundle\") pod \"1cc8460c-742b-4533-a26f-225de9c85310\" (UID: \"1cc8460c-742b-4533-a26f-225de9c85310\") " Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.719061 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-bundle" (OuterVolumeSpecName: "bundle") pod "1cc8460c-742b-4533-a26f-225de9c85310" (UID: "1cc8460c-742b-4533-a26f-225de9c85310"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.724837 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc8460c-742b-4533-a26f-225de9c85310-kube-api-access-2gshp" (OuterVolumeSpecName: "kube-api-access-2gshp") pod "1cc8460c-742b-4533-a26f-225de9c85310" (UID: "1cc8460c-742b-4533-a26f-225de9c85310"). InnerVolumeSpecName "kube-api-access-2gshp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.733591 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-util" (OuterVolumeSpecName: "util") pod "1cc8460c-742b-4533-a26f-225de9c85310" (UID: "1cc8460c-742b-4533-a26f-225de9c85310"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.819611 4760 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-util\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.819671 4760 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1cc8460c-742b-4533-a26f-225de9c85310-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:02 crc kubenswrapper[4760]: I1204 12:29:02.819685 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gshp\" (UniqueName: \"kubernetes.io/projected/1cc8460c-742b-4533-a26f-225de9c85310-kube-api-access-2gshp\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:03 crc kubenswrapper[4760]: I1204 12:29:03.354807 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98kd7" 
event={"ID":"6dbf3914-07b6-495a-a246-69913972a047","Type":"ContainerStarted","Data":"cdaa2f33400f470cfefe5900878d55b5ef09885a03bfe534ed3f32f5bea84c66"} Dec 04 12:29:03 crc kubenswrapper[4760]: I1204 12:29:03.357937 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" event={"ID":"1cc8460c-742b-4533-a26f-225de9c85310","Type":"ContainerDied","Data":"6083ced5cb5b6aeed134ee0fda3a9581a97d668886c387147bd6ff8a6d31693b"} Dec 04 12:29:03 crc kubenswrapper[4760]: I1204 12:29:03.357996 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6083ced5cb5b6aeed134ee0fda3a9581a97d668886c387147bd6ff8a6d31693b" Dec 04 12:29:03 crc kubenswrapper[4760]: I1204 12:29:03.358121 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5" Dec 04 12:29:03 crc kubenswrapper[4760]: I1204 12:29:03.376793 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-98kd7" podStartSLOduration=1.957091477 podStartE2EDuration="4.376769103s" podCreationTimestamp="2025-12-04 12:28:59 +0000 UTC" firstStartedPulling="2025-12-04 12:29:00.324843033 +0000 UTC m=+943.366289600" lastFinishedPulling="2025-12-04 12:29:02.744520659 +0000 UTC m=+945.785967226" observedRunningTime="2025-12-04 12:29:03.374426379 +0000 UTC m=+946.415872956" watchObservedRunningTime="2025-12-04 12:29:03.376769103 +0000 UTC m=+946.418215680" Dec 04 12:29:03 crc kubenswrapper[4760]: I1204 12:29:03.380828 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:29:03 crc kubenswrapper[4760]: I1204 
12:29:03.380900 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.823803 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz"] Dec 04 12:29:04 crc kubenswrapper[4760]: E1204 12:29:04.824168 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc8460c-742b-4533-a26f-225de9c85310" containerName="util" Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.824186 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc8460c-742b-4533-a26f-225de9c85310" containerName="util" Dec 04 12:29:04 crc kubenswrapper[4760]: E1204 12:29:04.824197 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc8460c-742b-4533-a26f-225de9c85310" containerName="pull" Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.824203 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc8460c-742b-4533-a26f-225de9c85310" containerName="pull" Dec 04 12:29:04 crc kubenswrapper[4760]: E1204 12:29:04.824231 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc8460c-742b-4533-a26f-225de9c85310" containerName="extract" Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.824238 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc8460c-742b-4533-a26f-225de9c85310" containerName="extract" Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.824363 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc8460c-742b-4533-a26f-225de9c85310" containerName="extract" Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.824913 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz" Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.832752 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.833301 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.843030 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz"] Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.848702 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-4tg6k" Dec 04 12:29:04 crc kubenswrapper[4760]: I1204 12:29:04.952991 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzmzf\" (UniqueName: \"kubernetes.io/projected/a753604d-ead8-4550-be02-3a5ae4827390-kube-api-access-nzmzf\") pod \"nmstate-operator-5b5b58f5c8-vv8cz\" (UID: \"a753604d-ead8-4550-be02-3a5ae4827390\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz" Dec 04 12:29:05 crc kubenswrapper[4760]: I1204 12:29:05.055197 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzmzf\" (UniqueName: \"kubernetes.io/projected/a753604d-ead8-4550-be02-3a5ae4827390-kube-api-access-nzmzf\") pod \"nmstate-operator-5b5b58f5c8-vv8cz\" (UID: \"a753604d-ead8-4550-be02-3a5ae4827390\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz" Dec 04 12:29:05 crc kubenswrapper[4760]: I1204 12:29:05.078177 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzmzf\" (UniqueName: \"kubernetes.io/projected/a753604d-ead8-4550-be02-3a5ae4827390-kube-api-access-nzmzf\") pod \"nmstate-operator-5b5b58f5c8-vv8cz\" (UID: 
\"a753604d-ead8-4550-be02-3a5ae4827390\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz" Dec 04 12:29:05 crc kubenswrapper[4760]: I1204 12:29:05.148173 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz" Dec 04 12:29:05 crc kubenswrapper[4760]: I1204 12:29:05.442660 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz"] Dec 04 12:29:06 crc kubenswrapper[4760]: I1204 12:29:06.378008 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz" event={"ID":"a753604d-ead8-4550-be02-3a5ae4827390","Type":"ContainerStarted","Data":"538bbd2d5679bf88e28381cd5c0e95fe7f0451c26c56a774c128a834458b6342"} Dec 04 12:29:09 crc kubenswrapper[4760]: I1204 12:29:09.405671 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz" event={"ID":"a753604d-ead8-4550-be02-3a5ae4827390","Type":"ContainerStarted","Data":"e3d5a54982f463da600557a4ff078a6a821c06d8dbcddb2a64b15ee07b8377e7"} Dec 04 12:29:09 crc kubenswrapper[4760]: I1204 12:29:09.434041 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-vv8cz" podStartSLOduration=2.589730705 podStartE2EDuration="5.434005432s" podCreationTimestamp="2025-12-04 12:29:04 +0000 UTC" firstStartedPulling="2025-12-04 12:29:05.526801807 +0000 UTC m=+948.568248374" lastFinishedPulling="2025-12-04 12:29:08.371076524 +0000 UTC m=+951.412523101" observedRunningTime="2025-12-04 12:29:09.430883673 +0000 UTC m=+952.472330250" watchObservedRunningTime="2025-12-04 12:29:09.434005432 +0000 UTC m=+952.475451999" Dec 04 12:29:09 crc kubenswrapper[4760]: I1204 12:29:09.537360 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:29:09 crc kubenswrapper[4760]: 
I1204 12:29:09.537495 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:29:09 crc kubenswrapper[4760]: I1204 12:29:09.580551 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.381731 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8"] Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.383317 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.385838 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-s6bxs" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.402187 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2"] Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.402996 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.405279 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.424396 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2"] Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.432523 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mkwd\" (UniqueName: \"kubernetes.io/projected/e4f6d4b1-9f69-4970-a2d0-141049cbee82-kube-api-access-9mkwd\") pod \"nmstate-metrics-7f946cbc9-6wct8\" (UID: \"e4f6d4b1-9f69-4970-a2d0-141049cbee82\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.432658 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e9bf818f-d737-4114-a5c3-003834179d27-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-lmqv2\" (UID: \"e9bf818f-d737-4114-a5c3-003834179d27\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.432725 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh6lx\" (UniqueName: \"kubernetes.io/projected/e9bf818f-d737-4114-a5c3-003834179d27-kube-api-access-dh6lx\") pod \"nmstate-webhook-5f6d4c5ccb-lmqv2\" (UID: \"e9bf818f-d737-4114-a5c3-003834179d27\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.448103 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-wvpfn"] Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.449090 4760 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.479531 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.494839 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8"] Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.554196 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e9bf818f-d737-4114-a5c3-003834179d27-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-lmqv2\" (UID: \"e9bf818f-d737-4114-a5c3-003834179d27\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.554299 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/491432c2-b909-4092-a693-409b65208f85-ovs-socket\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.554331 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh6lx\" (UniqueName: \"kubernetes.io/projected/e9bf818f-d737-4114-a5c3-003834179d27-kube-api-access-dh6lx\") pod \"nmstate-webhook-5f6d4c5ccb-lmqv2\" (UID: \"e9bf818f-d737-4114-a5c3-003834179d27\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.554376 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/491432c2-b909-4092-a693-409b65208f85-nmstate-lock\") pod \"nmstate-handler-wvpfn\" (UID: 
\"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.554456 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/491432c2-b909-4092-a693-409b65208f85-dbus-socket\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.554523 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mkwd\" (UniqueName: \"kubernetes.io/projected/e4f6d4b1-9f69-4970-a2d0-141049cbee82-kube-api-access-9mkwd\") pod \"nmstate-metrics-7f946cbc9-6wct8\" (UID: \"e4f6d4b1-9f69-4970-a2d0-141049cbee82\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.554551 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gbp4\" (UniqueName: \"kubernetes.io/projected/491432c2-b909-4092-a693-409b65208f85-kube-api-access-7gbp4\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.561129 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e9bf818f-d737-4114-a5c3-003834179d27-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-lmqv2\" (UID: \"e9bf818f-d737-4114-a5c3-003834179d27\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.576959 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mkwd\" (UniqueName: \"kubernetes.io/projected/e4f6d4b1-9f69-4970-a2d0-141049cbee82-kube-api-access-9mkwd\") pod 
\"nmstate-metrics-7f946cbc9-6wct8\" (UID: \"e4f6d4b1-9f69-4970-a2d0-141049cbee82\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.584842 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm"] Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.585668 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.588535 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.588874 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.588961 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-264dg" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.596578 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm"] Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.599822 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh6lx\" (UniqueName: \"kubernetes.io/projected/e9bf818f-d737-4114-a5c3-003834179d27-kube-api-access-dh6lx\") pod \"nmstate-webhook-5f6d4c5ccb-lmqv2\" (UID: \"e9bf818f-d737-4114-a5c3-003834179d27\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.657489 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gbp4\" (UniqueName: \"kubernetes.io/projected/491432c2-b909-4092-a693-409b65208f85-kube-api-access-7gbp4\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " 
pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.657595 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ac746240-d1e4-4a04-98f1-b22871ca58e4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-6hrpm\" (UID: \"ac746240-d1e4-4a04-98f1-b22871ca58e4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.657649 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac746240-d1e4-4a04-98f1-b22871ca58e4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-6hrpm\" (UID: \"ac746240-d1e4-4a04-98f1-b22871ca58e4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.657711 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/491432c2-b909-4092-a693-409b65208f85-ovs-socket\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.657750 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/491432c2-b909-4092-a693-409b65208f85-nmstate-lock\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.657812 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whg9s\" (UniqueName: \"kubernetes.io/projected/ac746240-d1e4-4a04-98f1-b22871ca58e4-kube-api-access-whg9s\") pod 
\"nmstate-console-plugin-7fbb5f6569-6hrpm\" (UID: \"ac746240-d1e4-4a04-98f1-b22871ca58e4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.657860 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/491432c2-b909-4092-a693-409b65208f85-dbus-socket\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.658354 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/491432c2-b909-4092-a693-409b65208f85-dbus-socket\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.659186 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/491432c2-b909-4092-a693-409b65208f85-ovs-socket\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.659404 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/491432c2-b909-4092-a693-409b65208f85-nmstate-lock\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.699433 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.701820 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gbp4\" (UniqueName: \"kubernetes.io/projected/491432c2-b909-4092-a693-409b65208f85-kube-api-access-7gbp4\") pod \"nmstate-handler-wvpfn\" (UID: \"491432c2-b909-4092-a693-409b65208f85\") " pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.721271 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.763004 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ac746240-d1e4-4a04-98f1-b22871ca58e4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-6hrpm\" (UID: \"ac746240-d1e4-4a04-98f1-b22871ca58e4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.763646 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac746240-d1e4-4a04-98f1-b22871ca58e4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-6hrpm\" (UID: \"ac746240-d1e4-4a04-98f1-b22871ca58e4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.763883 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whg9s\" (UniqueName: \"kubernetes.io/projected/ac746240-d1e4-4a04-98f1-b22871ca58e4-kube-api-access-whg9s\") pod \"nmstate-console-plugin-7fbb5f6569-6hrpm\" (UID: \"ac746240-d1e4-4a04-98f1-b22871ca58e4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.765411 
4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ac746240-d1e4-4a04-98f1-b22871ca58e4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-6hrpm\" (UID: \"ac746240-d1e4-4a04-98f1-b22871ca58e4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.771639 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac746240-d1e4-4a04-98f1-b22871ca58e4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-6hrpm\" (UID: \"ac746240-d1e4-4a04-98f1-b22871ca58e4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.774883 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-796f78c94d-7cdnw"] Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.785540 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.785979 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.794654 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-796f78c94d-7cdnw"] Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.800151 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whg9s\" (UniqueName: \"kubernetes.io/projected/ac746240-d1e4-4a04-98f1-b22871ca58e4-kube-api-access-whg9s\") pod \"nmstate-console-plugin-7fbb5f6569-6hrpm\" (UID: \"ac746240-d1e4-4a04-98f1-b22871ca58e4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: W1204 12:29:10.836954 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491432c2_b909_4092_a693_409b65208f85.slice/crio-2b3a71a94843136db7c567552a2cb948d0e9bb3c67707131887c2a65f921bf80 WatchSource:0}: Error finding container 2b3a71a94843136db7c567552a2cb948d0e9bb3c67707131887c2a65f921bf80: Status 404 returned error can't find the container with id 2b3a71a94843136db7c567552a2cb948d0e9bb3c67707131887c2a65f921bf80 Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.865689 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-trusted-ca-bundle\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.865804 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn978\" (UniqueName: \"kubernetes.io/projected/4e7793af-3473-4794-919f-098f44060026-kube-api-access-vn978\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " 
pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.866173 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-oauth-serving-cert\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.866240 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e7793af-3473-4794-919f-098f44060026-console-serving-cert\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.866263 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e7793af-3473-4794-919f-098f44060026-console-oauth-config\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.866308 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-service-ca\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.866355 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-console-config\") pod 
\"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.934560 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.967777 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn978\" (UniqueName: \"kubernetes.io/projected/4e7793af-3473-4794-919f-098f44060026-kube-api-access-vn978\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.967821 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-oauth-serving-cert\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.967844 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e7793af-3473-4794-919f-098f44060026-console-serving-cert\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.967863 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e7793af-3473-4794-919f-098f44060026-console-oauth-config\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.967884 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-service-ca\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.967910 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-console-config\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.967937 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-trusted-ca-bundle\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.968919 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-trusted-ca-bundle\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.970848 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-oauth-serving-cert\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.970866 4760 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-service-ca\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.971116 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4e7793af-3473-4794-919f-098f44060026-console-config\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.975049 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e7793af-3473-4794-919f-098f44060026-console-serving-cert\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.976966 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4e7793af-3473-4794-919f-098f44060026-console-oauth-config\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:10 crc kubenswrapper[4760]: I1204 12:29:10.988798 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn978\" (UniqueName: \"kubernetes.io/projected/4e7793af-3473-4794-919f-098f44060026-kube-api-access-vn978\") pod \"console-796f78c94d-7cdnw\" (UID: \"4e7793af-3473-4794-919f-098f44060026\") " pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:11 crc kubenswrapper[4760]: I1204 12:29:11.018862 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8"] Dec 04 12:29:11 crc 
kubenswrapper[4760]: I1204 12:29:11.027666 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2"] Dec 04 12:29:11 crc kubenswrapper[4760]: W1204 12:29:11.027957 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4f6d4b1_9f69_4970_a2d0_141049cbee82.slice/crio-b179a2783378a7fd7b07a91eb31dcade889aa18de1ce3bc38d7498f2067f07b6 WatchSource:0}: Error finding container b179a2783378a7fd7b07a91eb31dcade889aa18de1ce3bc38d7498f2067f07b6: Status 404 returned error can't find the container with id b179a2783378a7fd7b07a91eb31dcade889aa18de1ce3bc38d7498f2067f07b6 Dec 04 12:29:11 crc kubenswrapper[4760]: I1204 12:29:11.106322 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:11 crc kubenswrapper[4760]: I1204 12:29:11.176256 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm"] Dec 04 12:29:11 crc kubenswrapper[4760]: W1204 12:29:11.177279 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac746240_d1e4_4a04_98f1_b22871ca58e4.slice/crio-5362305d2a14de1365ea1c17375148fb8c8dc5149a6975ae0243d1484342e11a WatchSource:0}: Error finding container 5362305d2a14de1365ea1c17375148fb8c8dc5149a6975ae0243d1484342e11a: Status 404 returned error can't find the container with id 5362305d2a14de1365ea1c17375148fb8c8dc5149a6975ae0243d1484342e11a Dec 04 12:29:11 crc kubenswrapper[4760]: I1204 12:29:11.327436 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-796f78c94d-7cdnw"] Dec 04 12:29:11 crc kubenswrapper[4760]: W1204 12:29:11.339917 4760 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e7793af_3473_4794_919f_098f44060026.slice/crio-cbbcd17742c0a4594c2af09930fc32a7b0f9d9e94b60b6deeaf0ccdcf30ed786 WatchSource:0}: Error finding container cbbcd17742c0a4594c2af09930fc32a7b0f9d9e94b60b6deeaf0ccdcf30ed786: Status 404 returned error can't find the container with id cbbcd17742c0a4594c2af09930fc32a7b0f9d9e94b60b6deeaf0ccdcf30ed786 Dec 04 12:29:11 crc kubenswrapper[4760]: I1204 12:29:11.425128 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8" event={"ID":"e4f6d4b1-9f69-4970-a2d0-141049cbee82","Type":"ContainerStarted","Data":"b179a2783378a7fd7b07a91eb31dcade889aa18de1ce3bc38d7498f2067f07b6"} Dec 04 12:29:11 crc kubenswrapper[4760]: I1204 12:29:11.427805 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" event={"ID":"ac746240-d1e4-4a04-98f1-b22871ca58e4","Type":"ContainerStarted","Data":"5362305d2a14de1365ea1c17375148fb8c8dc5149a6975ae0243d1484342e11a"} Dec 04 12:29:11 crc kubenswrapper[4760]: I1204 12:29:11.429347 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-796f78c94d-7cdnw" event={"ID":"4e7793af-3473-4794-919f-098f44060026","Type":"ContainerStarted","Data":"cbbcd17742c0a4594c2af09930fc32a7b0f9d9e94b60b6deeaf0ccdcf30ed786"} Dec 04 12:29:11 crc kubenswrapper[4760]: I1204 12:29:11.430366 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wvpfn" event={"ID":"491432c2-b909-4092-a693-409b65208f85","Type":"ContainerStarted","Data":"2b3a71a94843136db7c567552a2cb948d0e9bb3c67707131887c2a65f921bf80"} Dec 04 12:29:11 crc kubenswrapper[4760]: I1204 12:29:11.431686 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" 
event={"ID":"e9bf818f-d737-4114-a5c3-003834179d27","Type":"ContainerStarted","Data":"fd976a48df40267c1837ee50ec9155a4de4f8d5036c05cdaccc90f29bf39e4b5"} Dec 04 12:29:11 crc kubenswrapper[4760]: I1204 12:29:11.997785 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98kd7"] Dec 04 12:29:12 crc kubenswrapper[4760]: I1204 12:29:12.443364 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-796f78c94d-7cdnw" event={"ID":"4e7793af-3473-4794-919f-098f44060026","Type":"ContainerStarted","Data":"f5f7727f72d174c968bb4edfb99fd3457908c7bae81c8f6e697922f13cd1f80d"} Dec 04 12:29:12 crc kubenswrapper[4760]: I1204 12:29:12.443545 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-98kd7" podUID="6dbf3914-07b6-495a-a246-69913972a047" containerName="registry-server" containerID="cri-o://cdaa2f33400f470cfefe5900878d55b5ef09885a03bfe534ed3f32f5bea84c66" gracePeriod=2 Dec 04 12:29:12 crc kubenswrapper[4760]: I1204 12:29:12.485733 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-796f78c94d-7cdnw" podStartSLOduration=2.485709477 podStartE2EDuration="2.485709477s" podCreationTimestamp="2025-12-04 12:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:29:12.479420987 +0000 UTC m=+955.520867574" watchObservedRunningTime="2025-12-04 12:29:12.485709477 +0000 UTC m=+955.527156044" Dec 04 12:29:14 crc kubenswrapper[4760]: I1204 12:29:14.464298 4760 generic.go:334] "Generic (PLEG): container finished" podID="6dbf3914-07b6-495a-a246-69913972a047" containerID="cdaa2f33400f470cfefe5900878d55b5ef09885a03bfe534ed3f32f5bea84c66" exitCode=0 Dec 04 12:29:14 crc kubenswrapper[4760]: I1204 12:29:14.464521 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-98kd7" event={"ID":"6dbf3914-07b6-495a-a246-69913972a047","Type":"ContainerDied","Data":"cdaa2f33400f470cfefe5900878d55b5ef09885a03bfe534ed3f32f5bea84c66"} Dec 04 12:29:14 crc kubenswrapper[4760]: I1204 12:29:14.946391 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.039657 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-catalog-content\") pod \"6dbf3914-07b6-495a-a246-69913972a047\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.039743 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grhv2\" (UniqueName: \"kubernetes.io/projected/6dbf3914-07b6-495a-a246-69913972a047-kube-api-access-grhv2\") pod \"6dbf3914-07b6-495a-a246-69913972a047\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.039780 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-utilities\") pod \"6dbf3914-07b6-495a-a246-69913972a047\" (UID: \"6dbf3914-07b6-495a-a246-69913972a047\") " Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.041019 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-utilities" (OuterVolumeSpecName: "utilities") pod "6dbf3914-07b6-495a-a246-69913972a047" (UID: "6dbf3914-07b6-495a-a246-69913972a047"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.045796 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dbf3914-07b6-495a-a246-69913972a047-kube-api-access-grhv2" (OuterVolumeSpecName: "kube-api-access-grhv2") pod "6dbf3914-07b6-495a-a246-69913972a047" (UID: "6dbf3914-07b6-495a-a246-69913972a047"). InnerVolumeSpecName "kube-api-access-grhv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.140961 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grhv2\" (UniqueName: \"kubernetes.io/projected/6dbf3914-07b6-495a-a246-69913972a047-kube-api-access-grhv2\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.141001 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.152594 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dbf3914-07b6-495a-a246-69913972a047" (UID: "6dbf3914-07b6-495a-a246-69913972a047"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.242847 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dbf3914-07b6-495a-a246-69913972a047-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.473255 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98kd7" event={"ID":"6dbf3914-07b6-495a-a246-69913972a047","Type":"ContainerDied","Data":"f6afec51259b3c06ea288e4d12750af69759362529862574fa3e483ab496bed8"} Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.473309 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98kd7" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.473336 4760 scope.go:117] "RemoveContainer" containerID="cdaa2f33400f470cfefe5900878d55b5ef09885a03bfe534ed3f32f5bea84c66" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.475200 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wvpfn" event={"ID":"491432c2-b909-4092-a693-409b65208f85","Type":"ContainerStarted","Data":"611854383e35eba54782938ac5a56ec729f60bcdd4c50ff4af1a10e4317a602a"} Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.475721 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.481412 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" event={"ID":"e9bf818f-d737-4114-a5c3-003834179d27","Type":"ContainerStarted","Data":"40a59b9d294c13c3a3fbc744b049510476ad4cce0127137a5ba9743ea916d229"} Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.481548 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.486565 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" event={"ID":"ac746240-d1e4-4a04-98f1-b22871ca58e4","Type":"ContainerStarted","Data":"7a4390507c26c9db05754dc2e322d2daf0ebbfceb03190aec1b09e83f747e9f0"} Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.487947 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8" event={"ID":"e4f6d4b1-9f69-4970-a2d0-141049cbee82","Type":"ContainerStarted","Data":"d0519a3a044503870e682918bca1a682f2c10134c9deb2d108e85c8822c309e2"} Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.498386 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-wvpfn" podStartSLOduration=1.302886143 podStartE2EDuration="5.498368481s" podCreationTimestamp="2025-12-04 12:29:10 +0000 UTC" firstStartedPulling="2025-12-04 12:29:10.840379988 +0000 UTC m=+953.881826555" lastFinishedPulling="2025-12-04 12:29:15.035862326 +0000 UTC m=+958.077308893" observedRunningTime="2025-12-04 12:29:15.498127903 +0000 UTC m=+958.539574470" watchObservedRunningTime="2025-12-04 12:29:15.498368481 +0000 UTC m=+958.539815048" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.506407 4760 scope.go:117] "RemoveContainer" containerID="8782a5459e11da009c09fdff20a53c17b525af474b129811c72ebb8f603050f1" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.512926 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98kd7"] Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.516786 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-98kd7"] Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.525660 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-6hrpm" podStartSLOduration=1.749229132 podStartE2EDuration="5.525637026s" podCreationTimestamp="2025-12-04 12:29:10 +0000 UTC" firstStartedPulling="2025-12-04 12:29:11.179971809 +0000 UTC m=+954.221418376" lastFinishedPulling="2025-12-04 12:29:14.956379703 +0000 UTC m=+957.997826270" observedRunningTime="2025-12-04 12:29:15.524552652 +0000 UTC m=+958.565999220" watchObservedRunningTime="2025-12-04 12:29:15.525637026 +0000 UTC m=+958.567083603" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.551546 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" podStartSLOduration=1.599121986 podStartE2EDuration="5.551523158s" podCreationTimestamp="2025-12-04 12:29:10 +0000 UTC" firstStartedPulling="2025-12-04 12:29:11.054091732 +0000 UTC m=+954.095538299" lastFinishedPulling="2025-12-04 12:29:15.006492904 +0000 UTC m=+958.047939471" observedRunningTime="2025-12-04 12:29:15.544551907 +0000 UTC m=+958.585998474" watchObservedRunningTime="2025-12-04 12:29:15.551523158 +0000 UTC m=+958.592969735" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.553103 4760 scope.go:117] "RemoveContainer" containerID="8ccfd9ebbfb997c75afc573f63929bcbcd1fbe122a5a4ecc68a869b28a25a6e9" Dec 04 12:29:15 crc kubenswrapper[4760]: I1204 12:29:15.870557 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dbf3914-07b6-495a-a246-69913972a047" path="/var/lib/kubelet/pods/6dbf3914-07b6-495a-a246-69913972a047/volumes" Dec 04 12:29:17 crc kubenswrapper[4760]: I1204 12:29:17.503516 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8" event={"ID":"e4f6d4b1-9f69-4970-a2d0-141049cbee82","Type":"ContainerStarted","Data":"15b2ceb41949d74f8e8046e864531a0a56368753c740838d723c185dd7018374"} Dec 04 12:29:17 crc kubenswrapper[4760]: I1204 12:29:17.521313 4760 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6wct8" podStartSLOduration=1.242001289 podStartE2EDuration="7.521193956s" podCreationTimestamp="2025-12-04 12:29:10 +0000 UTC" firstStartedPulling="2025-12-04 12:29:11.030596597 +0000 UTC m=+954.072043164" lastFinishedPulling="2025-12-04 12:29:17.309789264 +0000 UTC m=+960.351235831" observedRunningTime="2025-12-04 12:29:17.518782269 +0000 UTC m=+960.560228836" watchObservedRunningTime="2025-12-04 12:29:17.521193956 +0000 UTC m=+960.562640533" Dec 04 12:29:20 crc kubenswrapper[4760]: I1204 12:29:20.807903 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-wvpfn" Dec 04 12:29:21 crc kubenswrapper[4760]: I1204 12:29:21.106545 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:21 crc kubenswrapper[4760]: I1204 12:29:21.106917 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:21 crc kubenswrapper[4760]: I1204 12:29:21.112327 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:21 crc kubenswrapper[4760]: I1204 12:29:21.532885 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-796f78c94d-7cdnw" Dec 04 12:29:21 crc kubenswrapper[4760]: I1204 12:29:21.622929 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5lxbp"] Dec 04 12:29:30 crc kubenswrapper[4760]: I1204 12:29:30.731406 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-lmqv2" Dec 04 12:29:33 crc kubenswrapper[4760]: I1204 12:29:33.380310 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:29:33 crc kubenswrapper[4760]: I1204 12:29:33.380916 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.012622 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q"] Dec 04 12:29:45 crc kubenswrapper[4760]: E1204 12:29:45.013411 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbf3914-07b6-495a-a246-69913972a047" containerName="extract-content" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.013425 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbf3914-07b6-495a-a246-69913972a047" containerName="extract-content" Dec 04 12:29:45 crc kubenswrapper[4760]: E1204 12:29:45.013444 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbf3914-07b6-495a-a246-69913972a047" containerName="registry-server" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.013450 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbf3914-07b6-495a-a246-69913972a047" containerName="registry-server" Dec 04 12:29:45 crc kubenswrapper[4760]: E1204 12:29:45.013475 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dbf3914-07b6-495a-a246-69913972a047" containerName="extract-utilities" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.013481 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dbf3914-07b6-495a-a246-69913972a047" containerName="extract-utilities" Dec 04 
12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.013572 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dbf3914-07b6-495a-a246-69913972a047" containerName="registry-server" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.014486 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.016658 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.025375 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q"] Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.191914 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2p6w\" (UniqueName: \"kubernetes.io/projected/e3a7c8db-106b-48f4-a044-e604a1c6f934-kube-api-access-x2p6w\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.192316 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.192404 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.294225 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2p6w\" (UniqueName: \"kubernetes.io/projected/e3a7c8db-106b-48f4-a044-e604a1c6f934-kube-api-access-x2p6w\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.294311 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.294377 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.294871 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q\" (UID: 
\"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.294973 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.315831 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2p6w\" (UniqueName: \"kubernetes.io/projected/e3a7c8db-106b-48f4-a044-e604a1c6f934-kube-api-access-x2p6w\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.333055 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:45 crc kubenswrapper[4760]: I1204 12:29:45.728688 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q"] Dec 04 12:29:46 crc kubenswrapper[4760]: I1204 12:29:46.661884 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-5lxbp" podUID="8c791b99-3020-4b20-9d91-87c5ba9f615a" containerName="console" containerID="cri-o://aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc" gracePeriod=15 Dec 04 12:29:46 crc kubenswrapper[4760]: I1204 12:29:46.682640 4760 generic.go:334] "Generic (PLEG): container finished" podID="e3a7c8db-106b-48f4-a044-e604a1c6f934" containerID="97b32d3f50ab8de61194fc8d851c1fd72e2d727fc5d7249f5170139150ce70b0" exitCode=0 Dec 04 12:29:46 crc kubenswrapper[4760]: I1204 12:29:46.682763 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" event={"ID":"e3a7c8db-106b-48f4-a044-e604a1c6f934","Type":"ContainerDied","Data":"97b32d3f50ab8de61194fc8d851c1fd72e2d727fc5d7249f5170139150ce70b0"} Dec 04 12:29:46 crc kubenswrapper[4760]: I1204 12:29:46.682871 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" event={"ID":"e3a7c8db-106b-48f4-a044-e604a1c6f934","Type":"ContainerStarted","Data":"ad9a1155f3ffc7814c8f67b3cf8b30ce74cc7bcf8eba33a96cda1ebea07eb142"} Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.005541 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5lxbp_8c791b99-3020-4b20-9d91-87c5ba9f615a/console/0.log" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.005625 4760 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.120061 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-oauth-config\") pod \"8c791b99-3020-4b20-9d91-87c5ba9f615a\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.120109 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-trusted-ca-bundle\") pod \"8c791b99-3020-4b20-9d91-87c5ba9f615a\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.120131 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-serving-cert\") pod \"8c791b99-3020-4b20-9d91-87c5ba9f615a\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.120146 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-service-ca\") pod \"8c791b99-3020-4b20-9d91-87c5ba9f615a\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.120195 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-oauth-serving-cert\") pod \"8c791b99-3020-4b20-9d91-87c5ba9f615a\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.120237 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gwcsq\" (UniqueName: \"kubernetes.io/projected/8c791b99-3020-4b20-9d91-87c5ba9f615a-kube-api-access-gwcsq\") pod \"8c791b99-3020-4b20-9d91-87c5ba9f615a\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.120346 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-config\") pod \"8c791b99-3020-4b20-9d91-87c5ba9f615a\" (UID: \"8c791b99-3020-4b20-9d91-87c5ba9f615a\") " Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.121664 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-config" (OuterVolumeSpecName: "console-config") pod "8c791b99-3020-4b20-9d91-87c5ba9f615a" (UID: "8c791b99-3020-4b20-9d91-87c5ba9f615a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.121940 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-service-ca" (OuterVolumeSpecName: "service-ca") pod "8c791b99-3020-4b20-9d91-87c5ba9f615a" (UID: "8c791b99-3020-4b20-9d91-87c5ba9f615a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.121967 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8c791b99-3020-4b20-9d91-87c5ba9f615a" (UID: "8c791b99-3020-4b20-9d91-87c5ba9f615a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.122001 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8c791b99-3020-4b20-9d91-87c5ba9f615a" (UID: "8c791b99-3020-4b20-9d91-87c5ba9f615a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.127199 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c791b99-3020-4b20-9d91-87c5ba9f615a-kube-api-access-gwcsq" (OuterVolumeSpecName: "kube-api-access-gwcsq") pod "8c791b99-3020-4b20-9d91-87c5ba9f615a" (UID: "8c791b99-3020-4b20-9d91-87c5ba9f615a"). InnerVolumeSpecName "kube-api-access-gwcsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.127309 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8c791b99-3020-4b20-9d91-87c5ba9f615a" (UID: "8c791b99-3020-4b20-9d91-87c5ba9f615a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.127444 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8c791b99-3020-4b20-9d91-87c5ba9f615a" (UID: "8c791b99-3020-4b20-9d91-87c5ba9f615a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.222759 4760 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.222801 4760 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.222815 4760 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.222828 4760 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c791b99-3020-4b20-9d91-87c5ba9f615a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.222839 4760 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.222850 4760 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c791b99-3020-4b20-9d91-87c5ba9f615a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.222861 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwcsq\" (UniqueName: \"kubernetes.io/projected/8c791b99-3020-4b20-9d91-87c5ba9f615a-kube-api-access-gwcsq\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:47 crc 
kubenswrapper[4760]: I1204 12:29:47.692030 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5lxbp_8c791b99-3020-4b20-9d91-87c5ba9f615a/console/0.log" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.692392 4760 generic.go:334] "Generic (PLEG): container finished" podID="8c791b99-3020-4b20-9d91-87c5ba9f615a" containerID="aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc" exitCode=2 Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.692437 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5lxbp" event={"ID":"8c791b99-3020-4b20-9d91-87c5ba9f615a","Type":"ContainerDied","Data":"aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc"} Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.692468 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5lxbp" event={"ID":"8c791b99-3020-4b20-9d91-87c5ba9f615a","Type":"ContainerDied","Data":"53643b3b518b2595fe5f8491b9dafc6af6f8618c6efa1202dfc1618c1059b12e"} Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.692489 4760 scope.go:117] "RemoveContainer" containerID="aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.692484 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-5lxbp" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.725505 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5lxbp"] Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.729508 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-5lxbp"] Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.730265 4760 scope.go:117] "RemoveContainer" containerID="aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc" Dec 04 12:29:47 crc kubenswrapper[4760]: E1204 12:29:47.730665 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc\": container with ID starting with aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc not found: ID does not exist" containerID="aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.730705 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc"} err="failed to get container status \"aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc\": rpc error: code = NotFound desc = could not find container \"aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc\": container with ID starting with aa420ea4759ffccf68abf8b491c7c365ab752be661efc8dd12dc4e0e23adb6cc not found: ID does not exist" Dec 04 12:29:47 crc kubenswrapper[4760]: I1204 12:29:47.873444 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c791b99-3020-4b20-9d91-87c5ba9f615a" path="/var/lib/kubelet/pods/8c791b99-3020-4b20-9d91-87c5ba9f615a/volumes" Dec 04 12:29:48 crc kubenswrapper[4760]: I1204 12:29:48.700710 4760 generic.go:334] "Generic (PLEG): 
container finished" podID="e3a7c8db-106b-48f4-a044-e604a1c6f934" containerID="c2386a1f0d49562a4a365c1f4475a2add0ae71b438ad63329ed2509595aa7481" exitCode=0 Dec 04 12:29:48 crc kubenswrapper[4760]: I1204 12:29:48.700867 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" event={"ID":"e3a7c8db-106b-48f4-a044-e604a1c6f934","Type":"ContainerDied","Data":"c2386a1f0d49562a4a365c1f4475a2add0ae71b438ad63329ed2509595aa7481"} Dec 04 12:29:49 crc kubenswrapper[4760]: I1204 12:29:49.728459 4760 generic.go:334] "Generic (PLEG): container finished" podID="e3a7c8db-106b-48f4-a044-e604a1c6f934" containerID="459512fef3f9d838516b25c1fa84bbcf5dc9fac89c82c561b9c08b716e970f63" exitCode=0 Dec 04 12:29:49 crc kubenswrapper[4760]: I1204 12:29:49.728510 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" event={"ID":"e3a7c8db-106b-48f4-a044-e604a1c6f934","Type":"ContainerDied","Data":"459512fef3f9d838516b25c1fa84bbcf5dc9fac89c82c561b9c08b716e970f63"} Dec 04 12:29:50 crc kubenswrapper[4760]: I1204 12:29:50.972613 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.078457 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-util\") pod \"e3a7c8db-106b-48f4-a044-e604a1c6f934\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.078551 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-bundle\") pod \"e3a7c8db-106b-48f4-a044-e604a1c6f934\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.078574 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2p6w\" (UniqueName: \"kubernetes.io/projected/e3a7c8db-106b-48f4-a044-e604a1c6f934-kube-api-access-x2p6w\") pod \"e3a7c8db-106b-48f4-a044-e604a1c6f934\" (UID: \"e3a7c8db-106b-48f4-a044-e604a1c6f934\") " Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.080018 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-bundle" (OuterVolumeSpecName: "bundle") pod "e3a7c8db-106b-48f4-a044-e604a1c6f934" (UID: "e3a7c8db-106b-48f4-a044-e604a1c6f934"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.086318 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a7c8db-106b-48f4-a044-e604a1c6f934-kube-api-access-x2p6w" (OuterVolumeSpecName: "kube-api-access-x2p6w") pod "e3a7c8db-106b-48f4-a044-e604a1c6f934" (UID: "e3a7c8db-106b-48f4-a044-e604a1c6f934"). InnerVolumeSpecName "kube-api-access-x2p6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.180097 4760 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.180141 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2p6w\" (UniqueName: \"kubernetes.io/projected/e3a7c8db-106b-48f4-a044-e604a1c6f934-kube-api-access-x2p6w\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.341896 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-util" (OuterVolumeSpecName: "util") pod "e3a7c8db-106b-48f4-a044-e604a1c6f934" (UID: "e3a7c8db-106b-48f4-a044-e604a1c6f934"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.382438 4760 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3a7c8db-106b-48f4-a044-e604a1c6f934-util\") on node \"crc\" DevicePath \"\"" Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.760953 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" event={"ID":"e3a7c8db-106b-48f4-a044-e604a1c6f934","Type":"ContainerDied","Data":"ad9a1155f3ffc7814c8f67b3cf8b30ce74cc7bcf8eba33a96cda1ebea07eb142"} Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.760990 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9a1155f3ffc7814c8f67b3cf8b30ce74cc7bcf8eba33a96cda1ebea07eb142" Dec 04 12:29:51 crc kubenswrapper[4760]: I1204 12:29:51.761065 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.051713 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv"] Dec 04 12:30:00 crc kubenswrapper[4760]: E1204 12:30:00.052651 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a7c8db-106b-48f4-a044-e604a1c6f934" containerName="util" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.052672 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a7c8db-106b-48f4-a044-e604a1c6f934" containerName="util" Dec 04 12:30:00 crc kubenswrapper[4760]: E1204 12:30:00.052691 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a7c8db-106b-48f4-a044-e604a1c6f934" containerName="pull" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.052699 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a7c8db-106b-48f4-a044-e604a1c6f934" containerName="pull" Dec 04 12:30:00 crc kubenswrapper[4760]: E1204 12:30:00.052720 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a7c8db-106b-48f4-a044-e604a1c6f934" containerName="extract" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.052727 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a7c8db-106b-48f4-a044-e604a1c6f934" containerName="extract" Dec 04 12:30:00 crc kubenswrapper[4760]: E1204 12:30:00.052737 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c791b99-3020-4b20-9d91-87c5ba9f615a" containerName="console" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.052743 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c791b99-3020-4b20-9d91-87c5ba9f615a" containerName="console" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.052908 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c791b99-3020-4b20-9d91-87c5ba9f615a" 
containerName="console" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.052923 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a7c8db-106b-48f4-a044-e604a1c6f934" containerName="extract" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.053599 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.058875 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.059107 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.059293 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.059459 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fqlfz" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.059659 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.072250 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv"] Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.148418 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f4735108-14df-4389-af8f-d3e7c56eba8f-apiservice-cert\") pod \"metallb-operator-controller-manager-645c4f57b7-84fcv\" (UID: \"f4735108-14df-4389-af8f-d3e7c56eba8f\") " pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 
04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.148491 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sn8r\" (UniqueName: \"kubernetes.io/projected/f4735108-14df-4389-af8f-d3e7c56eba8f-kube-api-access-4sn8r\") pod \"metallb-operator-controller-manager-645c4f57b7-84fcv\" (UID: \"f4735108-14df-4389-af8f-d3e7c56eba8f\") " pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.148542 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f4735108-14df-4389-af8f-d3e7c56eba8f-webhook-cert\") pod \"metallb-operator-controller-manager-645c4f57b7-84fcv\" (UID: \"f4735108-14df-4389-af8f-d3e7c56eba8f\") " pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.189308 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4"] Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.190339 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.198957 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.198957 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.221598 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4"] Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.249704 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wfmf\" (UniqueName: \"kubernetes.io/projected/d27f1467-5844-498c-ab06-fdb1379f24b4-kube-api-access-8wfmf\") pod \"collect-profiles-29414190-f4ml4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.249797 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d27f1467-5844-498c-ab06-fdb1379f24b4-config-volume\") pod \"collect-profiles-29414190-f4ml4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.249832 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f4735108-14df-4389-af8f-d3e7c56eba8f-apiservice-cert\") pod \"metallb-operator-controller-manager-645c4f57b7-84fcv\" (UID: \"f4735108-14df-4389-af8f-d3e7c56eba8f\") " 
pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.249862 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sn8r\" (UniqueName: \"kubernetes.io/projected/f4735108-14df-4389-af8f-d3e7c56eba8f-kube-api-access-4sn8r\") pod \"metallb-operator-controller-manager-645c4f57b7-84fcv\" (UID: \"f4735108-14df-4389-af8f-d3e7c56eba8f\") " pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.250018 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f4735108-14df-4389-af8f-d3e7c56eba8f-webhook-cert\") pod \"metallb-operator-controller-manager-645c4f57b7-84fcv\" (UID: \"f4735108-14df-4389-af8f-d3e7c56eba8f\") " pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.250087 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d27f1467-5844-498c-ab06-fdb1379f24b4-secret-volume\") pod \"collect-profiles-29414190-f4ml4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.265366 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f4735108-14df-4389-af8f-d3e7c56eba8f-apiservice-cert\") pod \"metallb-operator-controller-manager-645c4f57b7-84fcv\" (UID: \"f4735108-14df-4389-af8f-d3e7c56eba8f\") " pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.267071 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/f4735108-14df-4389-af8f-d3e7c56eba8f-webhook-cert\") pod \"metallb-operator-controller-manager-645c4f57b7-84fcv\" (UID: \"f4735108-14df-4389-af8f-d3e7c56eba8f\") " pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.279841 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sn8r\" (UniqueName: \"kubernetes.io/projected/f4735108-14df-4389-af8f-d3e7c56eba8f-kube-api-access-4sn8r\") pod \"metallb-operator-controller-manager-645c4f57b7-84fcv\" (UID: \"f4735108-14df-4389-af8f-d3e7c56eba8f\") " pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.351690 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d27f1467-5844-498c-ab06-fdb1379f24b4-config-volume\") pod \"collect-profiles-29414190-f4ml4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.351764 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d27f1467-5844-498c-ab06-fdb1379f24b4-secret-volume\") pod \"collect-profiles-29414190-f4ml4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.351808 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wfmf\" (UniqueName: \"kubernetes.io/projected/d27f1467-5844-498c-ab06-fdb1379f24b4-kube-api-access-8wfmf\") pod \"collect-profiles-29414190-f4ml4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 
crc kubenswrapper[4760]: I1204 12:30:00.352981 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d27f1467-5844-498c-ab06-fdb1379f24b4-config-volume\") pod \"collect-profiles-29414190-f4ml4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.366034 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d27f1467-5844-498c-ab06-fdb1379f24b4-secret-volume\") pod \"collect-profiles-29414190-f4ml4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.376944 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.380814 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wfmf\" (UniqueName: \"kubernetes.io/projected/d27f1467-5844-498c-ab06-fdb1379f24b4-kube-api-access-8wfmf\") pod \"collect-profiles-29414190-f4ml4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.383241 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8"] Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.384153 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.386490 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.387385 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-q87j5" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.387689 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.401901 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8"] Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.454464 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce6e08a4-5fa8-42c9-929d-94af09b81ec2-apiservice-cert\") pod \"metallb-operator-webhook-server-7578d999c8-hsqg8\" (UID: \"ce6e08a4-5fa8-42c9-929d-94af09b81ec2\") " pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.454568 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce6e08a4-5fa8-42c9-929d-94af09b81ec2-webhook-cert\") pod \"metallb-operator-webhook-server-7578d999c8-hsqg8\" (UID: \"ce6e08a4-5fa8-42c9-929d-94af09b81ec2\") " pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.454619 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwxgm\" (UniqueName: 
\"kubernetes.io/projected/ce6e08a4-5fa8-42c9-929d-94af09b81ec2-kube-api-access-xwxgm\") pod \"metallb-operator-webhook-server-7578d999c8-hsqg8\" (UID: \"ce6e08a4-5fa8-42c9-929d-94af09b81ec2\") " pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.506256 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.556000 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce6e08a4-5fa8-42c9-929d-94af09b81ec2-webhook-cert\") pod \"metallb-operator-webhook-server-7578d999c8-hsqg8\" (UID: \"ce6e08a4-5fa8-42c9-929d-94af09b81ec2\") " pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.556098 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwxgm\" (UniqueName: \"kubernetes.io/projected/ce6e08a4-5fa8-42c9-929d-94af09b81ec2-kube-api-access-xwxgm\") pod \"metallb-operator-webhook-server-7578d999c8-hsqg8\" (UID: \"ce6e08a4-5fa8-42c9-929d-94af09b81ec2\") " pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.556226 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce6e08a4-5fa8-42c9-929d-94af09b81ec2-apiservice-cert\") pod \"metallb-operator-webhook-server-7578d999c8-hsqg8\" (UID: \"ce6e08a4-5fa8-42c9-929d-94af09b81ec2\") " pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.562011 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ce6e08a4-5fa8-42c9-929d-94af09b81ec2-webhook-cert\") pod \"metallb-operator-webhook-server-7578d999c8-hsqg8\" (UID: \"ce6e08a4-5fa8-42c9-929d-94af09b81ec2\") " pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.578058 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce6e08a4-5fa8-42c9-929d-94af09b81ec2-apiservice-cert\") pod \"metallb-operator-webhook-server-7578d999c8-hsqg8\" (UID: \"ce6e08a4-5fa8-42c9-929d-94af09b81ec2\") " pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.586083 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwxgm\" (UniqueName: \"kubernetes.io/projected/ce6e08a4-5fa8-42c9-929d-94af09b81ec2-kube-api-access-xwxgm\") pod \"metallb-operator-webhook-server-7578d999c8-hsqg8\" (UID: \"ce6e08a4-5fa8-42c9-929d-94af09b81ec2\") " pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.695464 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv"] Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.739893 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.883001 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" event={"ID":"f4735108-14df-4389-af8f-d3e7c56eba8f","Type":"ContainerStarted","Data":"6c1024332c0b7b12dfb182abd04a80db2200a8118a01790764ed1a2e1578bd9b"} Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.972775 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4"] Dec 04 12:30:00 crc kubenswrapper[4760]: I1204 12:30:00.995741 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8"] Dec 04 12:30:01 crc kubenswrapper[4760]: W1204 12:30:01.009680 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce6e08a4_5fa8_42c9_929d_94af09b81ec2.slice/crio-5dbed11e2d9c7368837dc785c8b083096b867b353788866e51f7597ab8029b00 WatchSource:0}: Error finding container 5dbed11e2d9c7368837dc785c8b083096b867b353788866e51f7597ab8029b00: Status 404 returned error can't find the container with id 5dbed11e2d9c7368837dc785c8b083096b867b353788866e51f7597ab8029b00 Dec 04 12:30:01 crc kubenswrapper[4760]: I1204 12:30:01.906772 4760 generic.go:334] "Generic (PLEG): container finished" podID="d27f1467-5844-498c-ab06-fdb1379f24b4" containerID="c976613583e16dd96390021f6bbbdeb2fcfdec7f23f7faefff7df9771fafb812" exitCode=0 Dec 04 12:30:01 crc kubenswrapper[4760]: I1204 12:30:01.906932 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" event={"ID":"d27f1467-5844-498c-ab06-fdb1379f24b4","Type":"ContainerDied","Data":"c976613583e16dd96390021f6bbbdeb2fcfdec7f23f7faefff7df9771fafb812"} Dec 04 12:30:01 crc 
kubenswrapper[4760]: I1204 12:30:01.906981 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" event={"ID":"d27f1467-5844-498c-ab06-fdb1379f24b4","Type":"ContainerStarted","Data":"b83db4bb3e4f98fb251c5190d500500b56b3134fd25d7055cb995e8e02ccbc1f"} Dec 04 12:30:01 crc kubenswrapper[4760]: I1204 12:30:01.911083 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" event={"ID":"ce6e08a4-5fa8-42c9-929d-94af09b81ec2","Type":"ContainerStarted","Data":"5dbed11e2d9c7368837dc785c8b083096b867b353788866e51f7597ab8029b00"} Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.297184 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.380825 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.380906 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.380969 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.381948 4760 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9dbcb718be2a7f2596059e1c2783a32fa9aefcba6858c3d8e8320ae2bdc7181a"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.382024 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://9dbcb718be2a7f2596059e1c2783a32fa9aefcba6858c3d8e8320ae2bdc7181a" gracePeriod=600 Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.402241 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d27f1467-5844-498c-ab06-fdb1379f24b4-config-volume\") pod \"d27f1467-5844-498c-ab06-fdb1379f24b4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.402361 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d27f1467-5844-498c-ab06-fdb1379f24b4-secret-volume\") pod \"d27f1467-5844-498c-ab06-fdb1379f24b4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.402506 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wfmf\" (UniqueName: \"kubernetes.io/projected/d27f1467-5844-498c-ab06-fdb1379f24b4-kube-api-access-8wfmf\") pod \"d27f1467-5844-498c-ab06-fdb1379f24b4\" (UID: \"d27f1467-5844-498c-ab06-fdb1379f24b4\") " Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.403140 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d27f1467-5844-498c-ab06-fdb1379f24b4-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "d27f1467-5844-498c-ab06-fdb1379f24b4" (UID: "d27f1467-5844-498c-ab06-fdb1379f24b4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.410122 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d27f1467-5844-498c-ab06-fdb1379f24b4-kube-api-access-8wfmf" (OuterVolumeSpecName: "kube-api-access-8wfmf") pod "d27f1467-5844-498c-ab06-fdb1379f24b4" (UID: "d27f1467-5844-498c-ab06-fdb1379f24b4"). InnerVolumeSpecName "kube-api-access-8wfmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.410493 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d27f1467-5844-498c-ab06-fdb1379f24b4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d27f1467-5844-498c-ab06-fdb1379f24b4" (UID: "d27f1467-5844-498c-ab06-fdb1379f24b4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.504539 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wfmf\" (UniqueName: \"kubernetes.io/projected/d27f1467-5844-498c-ab06-fdb1379f24b4-kube-api-access-8wfmf\") on node \"crc\" DevicePath \"\"" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.504708 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d27f1467-5844-498c-ab06-fdb1379f24b4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.504747 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d27f1467-5844-498c-ab06-fdb1379f24b4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.946081 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="9dbcb718be2a7f2596059e1c2783a32fa9aefcba6858c3d8e8320ae2bdc7181a" exitCode=0 Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.946166 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"9dbcb718be2a7f2596059e1c2783a32fa9aefcba6858c3d8e8320ae2bdc7181a"} Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.946198 4760 scope.go:117] "RemoveContainer" containerID="84ba279dadecee1653448131b89b76db4ce63ea0a7071f444225a6a7cbc815ba" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.949018 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" event={"ID":"d27f1467-5844-498c-ab06-fdb1379f24b4","Type":"ContainerDied","Data":"b83db4bb3e4f98fb251c5190d500500b56b3134fd25d7055cb995e8e02ccbc1f"} Dec 04 12:30:03 crc 
kubenswrapper[4760]: I1204 12:30:03.949046 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b83db4bb3e4f98fb251c5190d500500b56b3134fd25d7055cb995e8e02ccbc1f" Dec 04 12:30:03 crc kubenswrapper[4760]: I1204 12:30:03.949106 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4" Dec 04 12:30:04 crc kubenswrapper[4760]: I1204 12:30:04.979060 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"bbb8ff1383b54b37d35a08dd354725d1bf3d8a55864345be2ff083742830474e"} Dec 04 12:30:05 crc kubenswrapper[4760]: I1204 12:30:05.988613 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" event={"ID":"f4735108-14df-4389-af8f-d3e7c56eba8f","Type":"ContainerStarted","Data":"c721116d1b730f6ef2490a83fedef8422465b5625c082b70fe0204cd64f2d225"} Dec 04 12:30:06 crc kubenswrapper[4760]: I1204 12:30:06.023102 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" podStartSLOduration=2.963237297 podStartE2EDuration="7.023079179s" podCreationTimestamp="2025-12-04 12:29:59 +0000 UTC" firstStartedPulling="2025-12-04 12:30:00.713032572 +0000 UTC m=+1003.754479139" lastFinishedPulling="2025-12-04 12:30:04.772874444 +0000 UTC m=+1007.814321021" observedRunningTime="2025-12-04 12:30:06.008575959 +0000 UTC m=+1009.050022526" watchObservedRunningTime="2025-12-04 12:30:06.023079179 +0000 UTC m=+1009.064525746" Dec 04 12:30:06 crc kubenswrapper[4760]: I1204 12:30:06.995023 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:08 crc 
kubenswrapper[4760]: I1204 12:30:08.003020 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" event={"ID":"ce6e08a4-5fa8-42c9-929d-94af09b81ec2","Type":"ContainerStarted","Data":"2559eea444fbcc1627fd86061864f7f1595fd9002e34359d92b80b1f2acf0824"} Dec 04 12:30:08 crc kubenswrapper[4760]: I1204 12:30:08.003757 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:08 crc kubenswrapper[4760]: I1204 12:30:08.023582 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" podStartSLOduration=2.072378713 podStartE2EDuration="8.023560925s" podCreationTimestamp="2025-12-04 12:30:00 +0000 UTC" firstStartedPulling="2025-12-04 12:30:01.014183465 +0000 UTC m=+1004.055630032" lastFinishedPulling="2025-12-04 12:30:06.965365687 +0000 UTC m=+1010.006812244" observedRunningTime="2025-12-04 12:30:08.022321066 +0000 UTC m=+1011.063767653" watchObservedRunningTime="2025-12-04 12:30:08.023560925 +0000 UTC m=+1011.065007492" Dec 04 12:30:20 crc kubenswrapper[4760]: I1204 12:30:20.744717 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7578d999c8-hsqg8" Dec 04 12:30:40 crc kubenswrapper[4760]: I1204 12:30:40.879402 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-645c4f57b7-84fcv" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.559420 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-n4b8p"] Dec 04 12:30:41 crc kubenswrapper[4760]: E1204 12:30:41.559941 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27f1467-5844-498c-ab06-fdb1379f24b4" containerName="collect-profiles" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 
12:30:41.559953 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27f1467-5844-498c-ab06-fdb1379f24b4" containerName="collect-profiles" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.560068 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d27f1467-5844-498c-ab06-fdb1379f24b4" containerName="collect-profiles" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.562225 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.564584 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.564843 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.565160 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-957qs" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.570961 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g"] Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.571889 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.574700 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.588283 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g"] Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.670121 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-rxp6m"] Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.671162 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.677749 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.678003 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-tsbfn" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.678872 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.679006 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.687133 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5pb2g\" (UID: \"49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.687246 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c29571e9-5b28-43ef-a429-9af2daa6f4bc-frr-startup\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.687272 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-frr-sockets\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.687292 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-metrics\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.687411 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9vqb\" (UniqueName: \"kubernetes.io/projected/c29571e9-5b28-43ef-a429-9af2daa6f4bc-kube-api-access-q9vqb\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.687466 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hvhd\" (UniqueName: \"kubernetes.io/projected/49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae-kube-api-access-7hvhd\") pod \"frr-k8s-webhook-server-7fcb986d4-5pb2g\" (UID: \"49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.687525 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-reloader\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.687568 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-frr-conf\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.687631 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c29571e9-5b28-43ef-a429-9af2daa6f4bc-metrics-certs\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.703874 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-9vpvj"] Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.705029 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.715239 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.731416 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-9vpvj"] Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.788834 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-frr-conf\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.788908 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c29571e9-5b28-43ef-a429-9af2daa6f4bc-metrics-certs\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.788950 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5pb2g\" (UID: \"49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.789006 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-memberlist\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.789033 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-metrics-certs\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.789071 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxszj\" (UniqueName: \"kubernetes.io/projected/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-kube-api-access-bxszj\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: E1204 12:30:41.789151 4760 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 04 12:30:41 crc kubenswrapper[4760]: E1204 12:30:41.789252 4760 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.789300 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c29571e9-5b28-43ef-a429-9af2daa6f4bc-frr-startup\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: E1204 12:30:41.789340 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae-cert podName:49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae nodeName:}" failed. No retries permitted until 2025-12-04 12:30:42.289315604 +0000 UTC m=+1045.330762261 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae-cert") pod "frr-k8s-webhook-server-7fcb986d4-5pb2g" (UID: "49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae") : secret "frr-k8s-webhook-server-cert" not found Dec 04 12:30:41 crc kubenswrapper[4760]: E1204 12:30:41.789360 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c29571e9-5b28-43ef-a429-9af2daa6f4bc-metrics-certs podName:c29571e9-5b28-43ef-a429-9af2daa6f4bc nodeName:}" failed. No retries permitted until 2025-12-04 12:30:42.289351255 +0000 UTC m=+1045.330797942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c29571e9-5b28-43ef-a429-9af2daa6f4bc-metrics-certs") pod "frr-k8s-n4b8p" (UID: "c29571e9-5b28-43ef-a429-9af2daa6f4bc") : secret "frr-k8s-certs-secret" not found Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.789411 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-frr-conf\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.789387 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-frr-sockets\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.789595 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-metrics\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: 
I1204 12:30:41.789673 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-metallb-excludel2\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.790022 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-frr-sockets\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.790095 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9vqb\" (UniqueName: \"kubernetes.io/projected/c29571e9-5b28-43ef-a429-9af2daa6f4bc-kube-api-access-q9vqb\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.790127 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hvhd\" (UniqueName: \"kubernetes.io/projected/49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae-kube-api-access-7hvhd\") pod \"frr-k8s-webhook-server-7fcb986d4-5pb2g\" (UID: \"49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.790162 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-reloader\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.790191 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-metrics\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.790331 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c29571e9-5b28-43ef-a429-9af2daa6f4bc-frr-startup\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.790460 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c29571e9-5b28-43ef-a429-9af2daa6f4bc-reloader\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.818108 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hvhd\" (UniqueName: \"kubernetes.io/projected/49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae-kube-api-access-7hvhd\") pod \"frr-k8s-webhook-server-7fcb986d4-5pb2g\" (UID: \"49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.826827 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9vqb\" (UniqueName: \"kubernetes.io/projected/c29571e9-5b28-43ef-a429-9af2daa6f4bc-kube-api-access-q9vqb\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.894162 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-memberlist\") pod \"speaker-rxp6m\" (UID: 
\"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.894198 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-metrics-certs\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.894304 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxszj\" (UniqueName: \"kubernetes.io/projected/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-kube-api-access-bxszj\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.894341 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66980e43-ab0c-4f0e-a66b-1ba0047809d2-metrics-certs\") pod \"controller-f8648f98b-9vpvj\" (UID: \"66980e43-ab0c-4f0e-a66b-1ba0047809d2\") " pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:41 crc kubenswrapper[4760]: E1204 12:30:41.894345 4760 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 12:30:41 crc kubenswrapper[4760]: E1204 12:30:41.894414 4760 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 04 12:30:41 crc kubenswrapper[4760]: E1204 12:30:41.894427 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-memberlist podName:986d9828-3e9c-4b9a-bdc5-aaa3eb184641 nodeName:}" failed. No retries permitted until 2025-12-04 12:30:42.39440238 +0000 UTC m=+1045.435848947 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-memberlist") pod "speaker-rxp6m" (UID: "986d9828-3e9c-4b9a-bdc5-aaa3eb184641") : secret "metallb-memberlist" not found Dec 04 12:30:41 crc kubenswrapper[4760]: E1204 12:30:41.894492 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-metrics-certs podName:986d9828-3e9c-4b9a-bdc5-aaa3eb184641 nodeName:}" failed. No retries permitted until 2025-12-04 12:30:42.394473563 +0000 UTC m=+1045.435920220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-metrics-certs") pod "speaker-rxp6m" (UID: "986d9828-3e9c-4b9a-bdc5-aaa3eb184641") : secret "speaker-certs-secret" not found Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.894358 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f4qj\" (UniqueName: \"kubernetes.io/projected/66980e43-ab0c-4f0e-a66b-1ba0047809d2-kube-api-access-2f4qj\") pod \"controller-f8648f98b-9vpvj\" (UID: \"66980e43-ab0c-4f0e-a66b-1ba0047809d2\") " pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.894626 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-metallb-excludel2\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.894677 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66980e43-ab0c-4f0e-a66b-1ba0047809d2-cert\") pod \"controller-f8648f98b-9vpvj\" (UID: 
\"66980e43-ab0c-4f0e-a66b-1ba0047809d2\") " pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.895544 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-metallb-excludel2\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.922015 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxszj\" (UniqueName: \"kubernetes.io/projected/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-kube-api-access-bxszj\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.996076 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66980e43-ab0c-4f0e-a66b-1ba0047809d2-metrics-certs\") pod \"controller-f8648f98b-9vpvj\" (UID: \"66980e43-ab0c-4f0e-a66b-1ba0047809d2\") " pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.996141 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f4qj\" (UniqueName: \"kubernetes.io/projected/66980e43-ab0c-4f0e-a66b-1ba0047809d2-kube-api-access-2f4qj\") pod \"controller-f8648f98b-9vpvj\" (UID: \"66980e43-ab0c-4f0e-a66b-1ba0047809d2\") " pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.996183 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66980e43-ab0c-4f0e-a66b-1ba0047809d2-cert\") pod \"controller-f8648f98b-9vpvj\" (UID: \"66980e43-ab0c-4f0e-a66b-1ba0047809d2\") " pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:41 crc 
kubenswrapper[4760]: E1204 12:30:41.996299 4760 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 04 12:30:41 crc kubenswrapper[4760]: E1204 12:30:41.996385 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66980e43-ab0c-4f0e-a66b-1ba0047809d2-metrics-certs podName:66980e43-ab0c-4f0e-a66b-1ba0047809d2 nodeName:}" failed. No retries permitted until 2025-12-04 12:30:42.496362798 +0000 UTC m=+1045.537809425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/66980e43-ab0c-4f0e-a66b-1ba0047809d2-metrics-certs") pod "controller-f8648f98b-9vpvj" (UID: "66980e43-ab0c-4f0e-a66b-1ba0047809d2") : secret "controller-certs-secret" not found Dec 04 12:30:41 crc kubenswrapper[4760]: I1204 12:30:41.998636 4760 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.010751 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66980e43-ab0c-4f0e-a66b-1ba0047809d2-cert\") pod \"controller-f8648f98b-9vpvj\" (UID: \"66980e43-ab0c-4f0e-a66b-1ba0047809d2\") " pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.018769 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f4qj\" (UniqueName: \"kubernetes.io/projected/66980e43-ab0c-4f0e-a66b-1ba0047809d2-kube-api-access-2f4qj\") pod \"controller-f8648f98b-9vpvj\" (UID: \"66980e43-ab0c-4f0e-a66b-1ba0047809d2\") " pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.298998 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c29571e9-5b28-43ef-a429-9af2daa6f4bc-metrics-certs\") pod 
\"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.299057 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5pb2g\" (UID: \"49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.302598 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c29571e9-5b28-43ef-a429-9af2daa6f4bc-metrics-certs\") pod \"frr-k8s-n4b8p\" (UID: \"c29571e9-5b28-43ef-a429-9af2daa6f4bc\") " pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.302837 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5pb2g\" (UID: \"49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.400291 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-memberlist\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.400339 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-metrics-certs\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:42 crc kubenswrapper[4760]: E1204 
12:30:42.400413 4760 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 12:30:42 crc kubenswrapper[4760]: E1204 12:30:42.400473 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-memberlist podName:986d9828-3e9c-4b9a-bdc5-aaa3eb184641 nodeName:}" failed. No retries permitted until 2025-12-04 12:30:43.400458679 +0000 UTC m=+1046.441905246 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-memberlist") pod "speaker-rxp6m" (UID: "986d9828-3e9c-4b9a-bdc5-aaa3eb184641") : secret "metallb-memberlist" not found Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.403601 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-metrics-certs\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.484498 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.496508 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.501051 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66980e43-ab0c-4f0e-a66b-1ba0047809d2-metrics-certs\") pod \"controller-f8648f98b-9vpvj\" (UID: \"66980e43-ab0c-4f0e-a66b-1ba0047809d2\") " pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.505512 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66980e43-ab0c-4f0e-a66b-1ba0047809d2-metrics-certs\") pod \"controller-f8648f98b-9vpvj\" (UID: \"66980e43-ab0c-4f0e-a66b-1ba0047809d2\") " pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.623057 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.874741 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-9vpvj"] Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.903737 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-9vpvj" event={"ID":"66980e43-ab0c-4f0e-a66b-1ba0047809d2","Type":"ContainerStarted","Data":"00447c0a07528f542e1f7727951164b226cf812b456b4e4bcd39df9f779815b0"} Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.905366 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4b8p" event={"ID":"c29571e9-5b28-43ef-a429-9af2daa6f4bc","Type":"ContainerStarted","Data":"f2c7b2c9dd9248c5b5ce3148e1b19e088bf2ba6aaf81f1f46aedcd46dfbeb17d"} Dec 04 12:30:42 crc kubenswrapper[4760]: I1204 12:30:42.951422 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g"] Dec 04 12:30:43 crc kubenswrapper[4760]: I1204 12:30:43.418561 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-memberlist\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:43 crc kubenswrapper[4760]: I1204 12:30:43.426188 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/986d9828-3e9c-4b9a-bdc5-aaa3eb184641-memberlist\") pod \"speaker-rxp6m\" (UID: \"986d9828-3e9c-4b9a-bdc5-aaa3eb184641\") " pod="metallb-system/speaker-rxp6m" Dec 04 12:30:43 crc kubenswrapper[4760]: I1204 12:30:43.485644 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-rxp6m" Dec 04 12:30:43 crc kubenswrapper[4760]: W1204 12:30:43.507864 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod986d9828_3e9c_4b9a_bdc5_aaa3eb184641.slice/crio-319b31ec5f71544599808e2d48e4677b1b0a4207c393eb49736f850dd25afe75 WatchSource:0}: Error finding container 319b31ec5f71544599808e2d48e4677b1b0a4207c393eb49736f850dd25afe75: Status 404 returned error can't find the container with id 319b31ec5f71544599808e2d48e4677b1b0a4207c393eb49736f850dd25afe75 Dec 04 12:30:43 crc kubenswrapper[4760]: I1204 12:30:43.915368 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rxp6m" event={"ID":"986d9828-3e9c-4b9a-bdc5-aaa3eb184641","Type":"ContainerStarted","Data":"324f8fdff1960f8a2fd438260354423291452a23b6f0d28bd6ef28570cbd20e5"} Dec 04 12:30:43 crc kubenswrapper[4760]: I1204 12:30:43.915812 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-rxp6m" 
event={"ID":"986d9828-3e9c-4b9a-bdc5-aaa3eb184641","Type":"ContainerStarted","Data":"319b31ec5f71544599808e2d48e4677b1b0a4207c393eb49736f850dd25afe75"} Dec 04 12:30:43 crc kubenswrapper[4760]: I1204 12:30:43.927935 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" event={"ID":"49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae","Type":"ContainerStarted","Data":"c0cf5f2697f5aa2093f28e9ab1dea5b70aeb853c0e3818bd6b8c8df8f357a5d7"} Dec 04 12:30:43 crc kubenswrapper[4760]: I1204 12:30:43.933724 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-9vpvj" event={"ID":"66980e43-ab0c-4f0e-a66b-1ba0047809d2","Type":"ContainerStarted","Data":"11bc686a98307e83fabd810a70e1cdc58a85bb09ddbf1e560f0a88ac40b8cf38"} Dec 04 12:30:43 crc kubenswrapper[4760]: I1204 12:30:43.933794 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-9vpvj" event={"ID":"66980e43-ab0c-4f0e-a66b-1ba0047809d2","Type":"ContainerStarted","Data":"16751dd9f77cd221eecb8021f3fe560fef3f08e195b6f98e2412a9bf2de72cdf"} Dec 04 12:30:43 crc kubenswrapper[4760]: I1204 12:30:43.934070 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:30:43 crc kubenswrapper[4760]: I1204 12:30:43.965689 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-9vpvj" podStartSLOduration=2.965654056 podStartE2EDuration="2.965654056s" podCreationTimestamp="2025-12-04 12:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:30:43.958804668 +0000 UTC m=+1047.000251235" watchObservedRunningTime="2025-12-04 12:30:43.965654056 +0000 UTC m=+1047.007100623" Dec 04 12:30:44 crc kubenswrapper[4760]: I1204 12:30:44.952915 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/speaker-rxp6m" event={"ID":"986d9828-3e9c-4b9a-bdc5-aaa3eb184641","Type":"ContainerStarted","Data":"0fdb146b7c948dcb8875c362cc9199435b7d93d4301eb52ba053076726a6c54a"} Dec 04 12:30:44 crc kubenswrapper[4760]: I1204 12:30:44.980613 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-rxp6m" podStartSLOduration=3.98058171 podStartE2EDuration="3.98058171s" podCreationTimestamp="2025-12-04 12:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:30:44.979373872 +0000 UTC m=+1048.020820459" watchObservedRunningTime="2025-12-04 12:30:44.98058171 +0000 UTC m=+1048.022028277" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.143542 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sqqhg"] Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.145078 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.172267 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqqhg"] Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.247585 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4c2d28f-e4ef-41bc-8769-83eb39cf2569-utilities\") pod \"community-operators-sqqhg\" (UID: \"b4c2d28f-e4ef-41bc-8769-83eb39cf2569\") " pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.247731 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4c2d28f-e4ef-41bc-8769-83eb39cf2569-catalog-content\") pod \"community-operators-sqqhg\" (UID: \"b4c2d28f-e4ef-41bc-8769-83eb39cf2569\") " pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.247778 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrfh9\" (UniqueName: \"kubernetes.io/projected/b4c2d28f-e4ef-41bc-8769-83eb39cf2569-kube-api-access-xrfh9\") pod \"community-operators-sqqhg\" (UID: \"b4c2d28f-e4ef-41bc-8769-83eb39cf2569\") " pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.353269 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4c2d28f-e4ef-41bc-8769-83eb39cf2569-catalog-content\") pod \"community-operators-sqqhg\" (UID: \"b4c2d28f-e4ef-41bc-8769-83eb39cf2569\") " pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.353342 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xrfh9\" (UniqueName: \"kubernetes.io/projected/b4c2d28f-e4ef-41bc-8769-83eb39cf2569-kube-api-access-xrfh9\") pod \"community-operators-sqqhg\" (UID: \"b4c2d28f-e4ef-41bc-8769-83eb39cf2569\") " pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.353406 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4c2d28f-e4ef-41bc-8769-83eb39cf2569-utilities\") pod \"community-operators-sqqhg\" (UID: \"b4c2d28f-e4ef-41bc-8769-83eb39cf2569\") " pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.354325 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4c2d28f-e4ef-41bc-8769-83eb39cf2569-utilities\") pod \"community-operators-sqqhg\" (UID: \"b4c2d28f-e4ef-41bc-8769-83eb39cf2569\") " pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.354656 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4c2d28f-e4ef-41bc-8769-83eb39cf2569-catalog-content\") pod \"community-operators-sqqhg\" (UID: \"b4c2d28f-e4ef-41bc-8769-83eb39cf2569\") " pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.395164 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrfh9\" (UniqueName: \"kubernetes.io/projected/b4c2d28f-e4ef-41bc-8769-83eb39cf2569-kube-api-access-xrfh9\") pod \"community-operators-sqqhg\" (UID: \"b4c2d28f-e4ef-41bc-8769-83eb39cf2569\") " pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.467882 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.944964 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqqhg"] Dec 04 12:30:45 crc kubenswrapper[4760]: I1204 12:30:45.966128 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-rxp6m" Dec 04 12:30:46 crc kubenswrapper[4760]: I1204 12:30:46.976329 4760 generic.go:334] "Generic (PLEG): container finished" podID="b4c2d28f-e4ef-41bc-8769-83eb39cf2569" containerID="f9de1a2cb7e354843684a36d21d8d83152ffe23eacf5f65c509c6db53e7a1ccb" exitCode=0 Dec 04 12:30:46 crc kubenswrapper[4760]: I1204 12:30:46.976539 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqhg" event={"ID":"b4c2d28f-e4ef-41bc-8769-83eb39cf2569","Type":"ContainerDied","Data":"f9de1a2cb7e354843684a36d21d8d83152ffe23eacf5f65c509c6db53e7a1ccb"} Dec 04 12:30:46 crc kubenswrapper[4760]: I1204 12:30:46.976588 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqhg" event={"ID":"b4c2d28f-e4ef-41bc-8769-83eb39cf2569","Type":"ContainerStarted","Data":"c1d4263cc6802185e17bba642f3e2948c9cac59ae63d58dae56f0a93881d983d"} Dec 04 12:30:51 crc kubenswrapper[4760]: I1204 12:30:51.738275 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hdmwf"] Dec 04 12:30:51 crc kubenswrapper[4760]: I1204 12:30:51.740311 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:51 crc kubenswrapper[4760]: I1204 12:30:51.755333 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdmwf"] Dec 04 12:30:51 crc kubenswrapper[4760]: I1204 12:30:51.904517 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-utilities\") pod \"certified-operators-hdmwf\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:51 crc kubenswrapper[4760]: I1204 12:30:51.904682 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-catalog-content\") pod \"certified-operators-hdmwf\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:51 crc kubenswrapper[4760]: I1204 12:30:51.904871 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt5vp\" (UniqueName: \"kubernetes.io/projected/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-kube-api-access-jt5vp\") pod \"certified-operators-hdmwf\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:52 crc kubenswrapper[4760]: I1204 12:30:52.005976 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-utilities\") pod \"certified-operators-hdmwf\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:52 crc kubenswrapper[4760]: I1204 12:30:52.006062 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-catalog-content\") pod \"certified-operators-hdmwf\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:52 crc kubenswrapper[4760]: I1204 12:30:52.006133 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt5vp\" (UniqueName: \"kubernetes.io/projected/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-kube-api-access-jt5vp\") pod \"certified-operators-hdmwf\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:52 crc kubenswrapper[4760]: I1204 12:30:52.006644 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-utilities\") pod \"certified-operators-hdmwf\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:52 crc kubenswrapper[4760]: I1204 12:30:52.006966 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-catalog-content\") pod \"certified-operators-hdmwf\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:52 crc kubenswrapper[4760]: I1204 12:30:52.035125 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt5vp\" (UniqueName: \"kubernetes.io/projected/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-kube-api-access-jt5vp\") pod \"certified-operators-hdmwf\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:52 crc kubenswrapper[4760]: I1204 12:30:52.063884 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:30:53 crc kubenswrapper[4760]: I1204 12:30:53.490365 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-rxp6m" Dec 04 12:30:56 crc kubenswrapper[4760]: I1204 12:30:56.201796 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nwwhw"] Dec 04 12:30:56 crc kubenswrapper[4760]: I1204 12:30:56.203188 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nwwhw" Dec 04 12:30:56 crc kubenswrapper[4760]: I1204 12:30:56.205123 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 04 12:30:56 crc kubenswrapper[4760]: I1204 12:30:56.205268 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-kzr87" Dec 04 12:30:56 crc kubenswrapper[4760]: I1204 12:30:56.205363 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 04 12:30:56 crc kubenswrapper[4760]: I1204 12:30:56.216741 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nwwhw"] Dec 04 12:30:56 crc kubenswrapper[4760]: I1204 12:30:56.271604 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64cch\" (UniqueName: \"kubernetes.io/projected/b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c-kube-api-access-64cch\") pod \"openstack-operator-index-nwwhw\" (UID: \"b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c\") " pod="openstack-operators/openstack-operator-index-nwwhw" Dec 04 12:30:56 crc kubenswrapper[4760]: I1204 12:30:56.373506 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64cch\" (UniqueName: 
\"kubernetes.io/projected/b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c-kube-api-access-64cch\") pod \"openstack-operator-index-nwwhw\" (UID: \"b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c\") " pod="openstack-operators/openstack-operator-index-nwwhw" Dec 04 12:30:56 crc kubenswrapper[4760]: I1204 12:30:56.393566 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64cch\" (UniqueName: \"kubernetes.io/projected/b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c-kube-api-access-64cch\") pod \"openstack-operator-index-nwwhw\" (UID: \"b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c\") " pod="openstack-operators/openstack-operator-index-nwwhw" Dec 04 12:30:56 crc kubenswrapper[4760]: I1204 12:30:56.520839 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nwwhw" Dec 04 12:30:57 crc kubenswrapper[4760]: I1204 12:30:57.873389 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdmwf"] Dec 04 12:30:57 crc kubenswrapper[4760]: W1204 12:30:57.881180 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ef784e2_c0eb_4595_8f77_ebef054e2eb9.slice/crio-3d5e2c781ac60397458aa44e1f9c030b7f0b7e45733adc7e6bf3b1f46e20ec24 WatchSource:0}: Error finding container 3d5e2c781ac60397458aa44e1f9c030b7f0b7e45733adc7e6bf3b1f46e20ec24: Status 404 returned error can't find the container with id 3d5e2c781ac60397458aa44e1f9c030b7f0b7e45733adc7e6bf3b1f46e20ec24 Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.027381 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nwwhw"] Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.057964 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" 
event={"ID":"49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae","Type":"ContainerStarted","Data":"eb3b8bd05b60a038de93ea54a64066fc7e9967ef64eb77d2b537fad202a370b4"} Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.059337 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.061578 4760 generic.go:334] "Generic (PLEG): container finished" podID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" containerID="702eb2a4f4c4fecc9cbbf3e005ec8350bbb3c63d6c3830be71dff1b0895db27a" exitCode=0 Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.061639 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdmwf" event={"ID":"7ef784e2-c0eb-4595-8f77-ebef054e2eb9","Type":"ContainerDied","Data":"702eb2a4f4c4fecc9cbbf3e005ec8350bbb3c63d6c3830be71dff1b0895db27a"} Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.061660 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdmwf" event={"ID":"7ef784e2-c0eb-4595-8f77-ebef054e2eb9","Type":"ContainerStarted","Data":"3d5e2c781ac60397458aa44e1f9c030b7f0b7e45733adc7e6bf3b1f46e20ec24"} Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.065625 4760 generic.go:334] "Generic (PLEG): container finished" podID="b4c2d28f-e4ef-41bc-8769-83eb39cf2569" containerID="d072122bcabdcaee664bcbdf9b88e88d0ecc4c247303aae41911768e46033723" exitCode=0 Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.065694 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqhg" event={"ID":"b4c2d28f-e4ef-41bc-8769-83eb39cf2569","Type":"ContainerDied","Data":"d072122bcabdcaee664bcbdf9b88e88d0ecc4c247303aae41911768e46033723"} Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.067454 4760 generic.go:334] "Generic (PLEG): container finished" podID="c29571e9-5b28-43ef-a429-9af2daa6f4bc" 
containerID="a9c4258f4a1fc9f7aee93158828166d5780561b8ffb72dc6826e78d189e6c8f4" exitCode=0 Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.067515 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4b8p" event={"ID":"c29571e9-5b28-43ef-a429-9af2daa6f4bc","Type":"ContainerDied","Data":"a9c4258f4a1fc9f7aee93158828166d5780561b8ffb72dc6826e78d189e6c8f4"} Dec 04 12:30:58 crc kubenswrapper[4760]: I1204 12:30:58.080524 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" podStartSLOduration=2.58249971 podStartE2EDuration="17.080501228s" podCreationTimestamp="2025-12-04 12:30:41 +0000 UTC" firstStartedPulling="2025-12-04 12:30:42.963721853 +0000 UTC m=+1046.005168420" lastFinishedPulling="2025-12-04 12:30:57.461723371 +0000 UTC m=+1060.503169938" observedRunningTime="2025-12-04 12:30:58.077951687 +0000 UTC m=+1061.119398254" watchObservedRunningTime="2025-12-04 12:30:58.080501228 +0000 UTC m=+1061.121947795" Dec 04 12:30:58 crc kubenswrapper[4760]: W1204 12:30:58.130332 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4d8ece9_0c01_4d2f_a8b8_b6cc7a3ed06c.slice/crio-df54929a5bdade38b9a024627eecf542041406cb9c74c14cb639f5dfb6e796a8 WatchSource:0}: Error finding container df54929a5bdade38b9a024627eecf542041406cb9c74c14cb639f5dfb6e796a8: Status 404 returned error can't find the container with id df54929a5bdade38b9a024627eecf542041406cb9c74c14cb639f5dfb6e796a8 Dec 04 12:30:59 crc kubenswrapper[4760]: I1204 12:30:59.076501 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nwwhw" event={"ID":"b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c","Type":"ContainerStarted","Data":"df54929a5bdade38b9a024627eecf542041406cb9c74c14cb639f5dfb6e796a8"} Dec 04 12:30:59 crc kubenswrapper[4760]: I1204 12:30:59.079045 4760 generic.go:334] "Generic (PLEG): 
container finished" podID="c29571e9-5b28-43ef-a429-9af2daa6f4bc" containerID="331ecaeeb203e68452c6fe1ee01b7a099925937fd49abbfe1e3c1954051497de" exitCode=0 Dec 04 12:30:59 crc kubenswrapper[4760]: I1204 12:30:59.079123 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4b8p" event={"ID":"c29571e9-5b28-43ef-a429-9af2daa6f4bc","Type":"ContainerDied","Data":"331ecaeeb203e68452c6fe1ee01b7a099925937fd49abbfe1e3c1954051497de"} Dec 04 12:30:59 crc kubenswrapper[4760]: I1204 12:30:59.988994 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nwwhw"] Dec 04 12:31:00 crc kubenswrapper[4760]: I1204 12:31:00.592424 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-k2f7j"] Dec 04 12:31:00 crc kubenswrapper[4760]: I1204 12:31:00.593968 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-k2f7j" Dec 04 12:31:00 crc kubenswrapper[4760]: I1204 12:31:00.610185 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k2f7j"] Dec 04 12:31:00 crc kubenswrapper[4760]: I1204 12:31:00.646392 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nrkp\" (UniqueName: \"kubernetes.io/projected/a4e176f0-09d3-4710-a8a7-32cd09f03c4d-kube-api-access-7nrkp\") pod \"openstack-operator-index-k2f7j\" (UID: \"a4e176f0-09d3-4710-a8a7-32cd09f03c4d\") " pod="openstack-operators/openstack-operator-index-k2f7j" Dec 04 12:31:00 crc kubenswrapper[4760]: I1204 12:31:00.747809 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nrkp\" (UniqueName: \"kubernetes.io/projected/a4e176f0-09d3-4710-a8a7-32cd09f03c4d-kube-api-access-7nrkp\") pod \"openstack-operator-index-k2f7j\" (UID: \"a4e176f0-09d3-4710-a8a7-32cd09f03c4d\") " 
pod="openstack-operators/openstack-operator-index-k2f7j" Dec 04 12:31:00 crc kubenswrapper[4760]: I1204 12:31:00.773022 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nrkp\" (UniqueName: \"kubernetes.io/projected/a4e176f0-09d3-4710-a8a7-32cd09f03c4d-kube-api-access-7nrkp\") pod \"openstack-operator-index-k2f7j\" (UID: \"a4e176f0-09d3-4710-a8a7-32cd09f03c4d\") " pod="openstack-operators/openstack-operator-index-k2f7j" Dec 04 12:31:00 crc kubenswrapper[4760]: I1204 12:31:00.927932 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-k2f7j" Dec 04 12:31:01 crc kubenswrapper[4760]: I1204 12:31:01.096200 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sqqhg" event={"ID":"b4c2d28f-e4ef-41bc-8769-83eb39cf2569","Type":"ContainerStarted","Data":"07f805be872c8de89f23cbfd6af7cd58fce8e57a22fbeaec53d1062ec4eecbc2"} Dec 04 12:31:01 crc kubenswrapper[4760]: I1204 12:31:01.132984 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sqqhg" podStartSLOduration=3.998317555 podStartE2EDuration="16.132969035s" podCreationTimestamp="2025-12-04 12:30:45 +0000 UTC" firstStartedPulling="2025-12-04 12:30:46.979807956 +0000 UTC m=+1050.021254533" lastFinishedPulling="2025-12-04 12:30:59.114459446 +0000 UTC m=+1062.155906013" observedRunningTime="2025-12-04 12:31:01.131309222 +0000 UTC m=+1064.172755789" watchObservedRunningTime="2025-12-04 12:31:01.132969035 +0000 UTC m=+1064.174415602" Dec 04 12:31:01 crc kubenswrapper[4760]: I1204 12:31:01.764077 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k2f7j"] Dec 04 12:31:02 crc kubenswrapper[4760]: I1204 12:31:02.103332 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nwwhw" 
event={"ID":"b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c","Type":"ContainerStarted","Data":"41e028f3ff6d75a9c861676e5e0bdeb4cbe26a8abb0aede56250411175e0ec27"} Dec 04 12:31:02 crc kubenswrapper[4760]: I1204 12:31:02.103492 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-nwwhw" podUID="b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c" containerName="registry-server" containerID="cri-o://41e028f3ff6d75a9c861676e5e0bdeb4cbe26a8abb0aede56250411175e0ec27" gracePeriod=2 Dec 04 12:31:02 crc kubenswrapper[4760]: I1204 12:31:02.106129 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k2f7j" event={"ID":"a4e176f0-09d3-4710-a8a7-32cd09f03c4d","Type":"ContainerStarted","Data":"6e397d698a29eb40675265fb718c028ab406c641fd9519254b42274f241cb705"} Dec 04 12:31:02 crc kubenswrapper[4760]: I1204 12:31:02.106172 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k2f7j" event={"ID":"a4e176f0-09d3-4710-a8a7-32cd09f03c4d","Type":"ContainerStarted","Data":"47588388b19fa9254531e31f0e7d6306a5ef94504bcb283b804da09369f6591b"} Dec 04 12:31:02 crc kubenswrapper[4760]: I1204 12:31:02.116362 4760 generic.go:334] "Generic (PLEG): container finished" podID="c29571e9-5b28-43ef-a429-9af2daa6f4bc" containerID="6fa6e874382f3118563b1384cf1f5cc949c0e8b89c025bdfe6149a3e4f56fef6" exitCode=0 Dec 04 12:31:02 crc kubenswrapper[4760]: I1204 12:31:02.116480 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4b8p" event={"ID":"c29571e9-5b28-43ef-a429-9af2daa6f4bc","Type":"ContainerDied","Data":"6fa6e874382f3118563b1384cf1f5cc949c0e8b89c025bdfe6149a3e4f56fef6"} Dec 04 12:31:02 crc kubenswrapper[4760]: I1204 12:31:02.138437 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nwwhw" podStartSLOduration=3.318879077 podStartE2EDuration="6.138419428s" 
podCreationTimestamp="2025-12-04 12:30:56 +0000 UTC" firstStartedPulling="2025-12-04 12:30:58.133512381 +0000 UTC m=+1061.174958948" lastFinishedPulling="2025-12-04 12:31:00.953052732 +0000 UTC m=+1063.994499299" observedRunningTime="2025-12-04 12:31:02.13597248 +0000 UTC m=+1065.177419047" watchObservedRunningTime="2025-12-04 12:31:02.138419428 +0000 UTC m=+1065.179865995" Dec 04 12:31:02 crc kubenswrapper[4760]: I1204 12:31:02.139917 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdmwf" event={"ID":"7ef784e2-c0eb-4595-8f77-ebef054e2eb9","Type":"ContainerStarted","Data":"17e80fabf3efd7893290d66ea4a71f639fddd56157a03ea0f81efecb9e6d83f7"} Dec 04 12:31:02 crc kubenswrapper[4760]: I1204 12:31:02.196773 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-k2f7j" podStartSLOduration=2.133425879 podStartE2EDuration="2.19673719s" podCreationTimestamp="2025-12-04 12:31:00 +0000 UTC" firstStartedPulling="2025-12-04 12:31:01.785610506 +0000 UTC m=+1064.827057073" lastFinishedPulling="2025-12-04 12:31:01.848921817 +0000 UTC m=+1064.890368384" observedRunningTime="2025-12-04 12:31:02.193640072 +0000 UTC m=+1065.235086639" watchObservedRunningTime="2025-12-04 12:31:02.19673719 +0000 UTC m=+1065.238183757" Dec 04 12:31:02 crc kubenswrapper[4760]: I1204 12:31:02.758854 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-9vpvj" Dec 04 12:31:03 crc kubenswrapper[4760]: I1204 12:31:03.251306 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4b8p" event={"ID":"c29571e9-5b28-43ef-a429-9af2daa6f4bc","Type":"ContainerStarted","Data":"7e42e4ed5b7fe5f93cffd50497113c0520a2e2e2f8f1e1704337c3cd05e2fdfe"} Dec 04 12:31:03 crc kubenswrapper[4760]: I1204 12:31:03.253811 4760 generic.go:334] "Generic (PLEG): container finished" podID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" 
containerID="17e80fabf3efd7893290d66ea4a71f639fddd56157a03ea0f81efecb9e6d83f7" exitCode=0 Dec 04 12:31:03 crc kubenswrapper[4760]: I1204 12:31:03.253929 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdmwf" event={"ID":"7ef784e2-c0eb-4595-8f77-ebef054e2eb9","Type":"ContainerDied","Data":"17e80fabf3efd7893290d66ea4a71f639fddd56157a03ea0f81efecb9e6d83f7"} Dec 04 12:31:03 crc kubenswrapper[4760]: I1204 12:31:03.257082 4760 generic.go:334] "Generic (PLEG): container finished" podID="b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c" containerID="41e028f3ff6d75a9c861676e5e0bdeb4cbe26a8abb0aede56250411175e0ec27" exitCode=0 Dec 04 12:31:03 crc kubenswrapper[4760]: I1204 12:31:03.258007 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nwwhw" event={"ID":"b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c","Type":"ContainerDied","Data":"41e028f3ff6d75a9c861676e5e0bdeb4cbe26a8abb0aede56250411175e0ec27"} Dec 04 12:31:03 crc kubenswrapper[4760]: I1204 12:31:03.418949 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nwwhw" Dec 04 12:31:03 crc kubenswrapper[4760]: I1204 12:31:03.544365 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64cch\" (UniqueName: \"kubernetes.io/projected/b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c-kube-api-access-64cch\") pod \"b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c\" (UID: \"b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c\") " Dec 04 12:31:03 crc kubenswrapper[4760]: I1204 12:31:03.550398 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c-kube-api-access-64cch" (OuterVolumeSpecName: "kube-api-access-64cch") pod "b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c" (UID: "b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c"). InnerVolumeSpecName "kube-api-access-64cch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:31:03 crc kubenswrapper[4760]: I1204 12:31:03.645554 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64cch\" (UniqueName: \"kubernetes.io/projected/b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c-kube-api-access-64cch\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.273191 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4b8p" event={"ID":"c29571e9-5b28-43ef-a429-9af2daa6f4bc","Type":"ContainerStarted","Data":"b1a4534be76e9762f4fecc9ed81a458ba53a0d03cd09a7884995d034701a3777"} Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.273597 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4b8p" event={"ID":"c29571e9-5b28-43ef-a429-9af2daa6f4bc","Type":"ContainerStarted","Data":"db2ca0f29b87174289f8a025fc6732feba64d65138b786324f53f335fee86e3e"} Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.273613 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4b8p" event={"ID":"c29571e9-5b28-43ef-a429-9af2daa6f4bc","Type":"ContainerStarted","Data":"564776af5e0c6ea6d24d95465dbf865fe59ed1b2b7a9e962366f79dc8583c8df"} Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.273625 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4b8p" event={"ID":"c29571e9-5b28-43ef-a429-9af2daa6f4bc","Type":"ContainerStarted","Data":"1e2258d674d3d1d21e47795ba77cb9fe1ab0aa1d19208b5d228e1d0651e3c63d"} Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.276896 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdmwf" event={"ID":"7ef784e2-c0eb-4595-8f77-ebef054e2eb9","Type":"ContainerStarted","Data":"be0ab6c872329b843e040e8eed3448d7eb4a700b2a4eb94950921605d6a3c731"} Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.279736 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-nwwhw" event={"ID":"b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c","Type":"ContainerDied","Data":"df54929a5bdade38b9a024627eecf542041406cb9c74c14cb639f5dfb6e796a8"} Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.279889 4760 scope.go:117] "RemoveContainer" containerID="41e028f3ff6d75a9c861676e5e0bdeb4cbe26a8abb0aede56250411175e0ec27" Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.280027 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nwwhw" Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.304721 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hdmwf" podStartSLOduration=7.345494091 podStartE2EDuration="13.304704029s" podCreationTimestamp="2025-12-04 12:30:51 +0000 UTC" firstStartedPulling="2025-12-04 12:30:58.062690842 +0000 UTC m=+1061.104137409" lastFinishedPulling="2025-12-04 12:31:04.02190077 +0000 UTC m=+1067.063347347" observedRunningTime="2025-12-04 12:31:04.298480661 +0000 UTC m=+1067.339927248" watchObservedRunningTime="2025-12-04 12:31:04.304704029 +0000 UTC m=+1067.346150596" Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.320452 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nwwhw"] Dec 04 12:31:04 crc kubenswrapper[4760]: I1204 12:31:04.326408 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nwwhw"] Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.296062 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n4b8p" event={"ID":"c29571e9-5b28-43ef-a429-9af2daa6f4bc","Type":"ContainerStarted","Data":"e41ac0595bc0bd4c8a272a06190573e32028b916b6393b0ca0adf4e1494e0779"} Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.296795 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.331135 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-n4b8p" podStartSLOduration=9.541244502 podStartE2EDuration="24.331101027s" podCreationTimestamp="2025-12-04 12:30:41 +0000 UTC" firstStartedPulling="2025-12-04 12:30:42.6367368 +0000 UTC m=+1045.678183367" lastFinishedPulling="2025-12-04 12:30:57.426593325 +0000 UTC m=+1060.468039892" observedRunningTime="2025-12-04 12:31:05.323190686 +0000 UTC m=+1068.364637253" watchObservedRunningTime="2025-12-04 12:31:05.331101027 +0000 UTC m=+1068.372547594" Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.468683 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.468757 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.683007 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.875483 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c" path="/var/lib/kubelet/pods/b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c/volumes" Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.995411 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cvcck"] Dec 04 12:31:05 crc kubenswrapper[4760]: E1204 12:31:05.995714 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c" containerName="registry-server" Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.995730 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c" containerName="registry-server" Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.996372 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d8ece9-0c01-4d2f-a8b8-b6cc7a3ed06c" containerName="registry-server" Dec 04 12:31:05 crc kubenswrapper[4760]: I1204 12:31:05.998062 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.018083 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvcck"] Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.122889 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l29df\" (UniqueName: \"kubernetes.io/projected/10e3e98a-e7b6-4433-851a-bc316b49143c-kube-api-access-l29df\") pod \"redhat-marketplace-cvcck\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.123032 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-catalog-content\") pod \"redhat-marketplace-cvcck\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.123087 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-utilities\") pod \"redhat-marketplace-cvcck\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.224875 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l29df\" (UniqueName: \"kubernetes.io/projected/10e3e98a-e7b6-4433-851a-bc316b49143c-kube-api-access-l29df\") pod \"redhat-marketplace-cvcck\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.224982 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-catalog-content\") pod \"redhat-marketplace-cvcck\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.225048 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-utilities\") pod \"redhat-marketplace-cvcck\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.225581 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-utilities\") pod \"redhat-marketplace-cvcck\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.225792 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-catalog-content\") pod \"redhat-marketplace-cvcck\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.251314 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l29df\" (UniqueName: \"kubernetes.io/projected/10e3e98a-e7b6-4433-851a-bc316b49143c-kube-api-access-l29df\") pod \"redhat-marketplace-cvcck\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.321471 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.358490 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sqqhg" Dec 04 12:31:06 crc kubenswrapper[4760]: I1204 12:31:06.823057 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvcck"] Dec 04 12:31:07 crc kubenswrapper[4760]: I1204 12:31:07.314613 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvcck" event={"ID":"10e3e98a-e7b6-4433-851a-bc316b49143c","Type":"ContainerStarted","Data":"a7669faabf9812dfd9e5bdc66c66b3a1e7afadbcc9045990334254e8d0f124d3"} Dec 04 12:31:07 crc kubenswrapper[4760]: I1204 12:31:07.314714 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvcck" event={"ID":"10e3e98a-e7b6-4433-851a-bc316b49143c","Type":"ContainerStarted","Data":"fc0d25833f02d8b3d41cc34c36fddc17f5f2c3d8907f82181c0582255a9ade90"} Dec 04 12:31:07 crc kubenswrapper[4760]: I1204 12:31:07.485391 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:31:07 crc kubenswrapper[4760]: I1204 12:31:07.527293 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:31:08 crc kubenswrapper[4760]: I1204 12:31:08.324267 4760 generic.go:334] "Generic (PLEG): container finished" podID="10e3e98a-e7b6-4433-851a-bc316b49143c" 
containerID="a7669faabf9812dfd9e5bdc66c66b3a1e7afadbcc9045990334254e8d0f124d3" exitCode=0 Dec 04 12:31:08 crc kubenswrapper[4760]: I1204 12:31:08.324354 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvcck" event={"ID":"10e3e98a-e7b6-4433-851a-bc316b49143c","Type":"ContainerDied","Data":"a7669faabf9812dfd9e5bdc66c66b3a1e7afadbcc9045990334254e8d0f124d3"} Dec 04 12:31:10 crc kubenswrapper[4760]: I1204 12:31:10.339238 4760 generic.go:334] "Generic (PLEG): container finished" podID="10e3e98a-e7b6-4433-851a-bc316b49143c" containerID="20e3c198235effd9996601247bc5d4d8e4d6495aa16a166977b50bbb75d56d70" exitCode=0 Dec 04 12:31:10 crc kubenswrapper[4760]: I1204 12:31:10.339338 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvcck" event={"ID":"10e3e98a-e7b6-4433-851a-bc316b49143c","Type":"ContainerDied","Data":"20e3c198235effd9996601247bc5d4d8e4d6495aa16a166977b50bbb75d56d70"} Dec 04 12:31:10 crc kubenswrapper[4760]: I1204 12:31:10.805164 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sqqhg"] Dec 04 12:31:10 crc kubenswrapper[4760]: I1204 12:31:10.928609 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-k2f7j" Dec 04 12:31:10 crc kubenswrapper[4760]: I1204 12:31:10.928667 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-k2f7j" Dec 04 12:31:10 crc kubenswrapper[4760]: I1204 12:31:10.960522 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-k2f7j" Dec 04 12:31:11 crc kubenswrapper[4760]: I1204 12:31:11.382878 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qd697"] Dec 04 12:31:11 crc kubenswrapper[4760]: I1204 12:31:11.383363 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qd697" podUID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerName="registry-server" containerID="cri-o://d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c" gracePeriod=2 Dec 04 12:31:11 crc kubenswrapper[4760]: I1204 12:31:11.383454 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-k2f7j" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.064912 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.066100 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.155446 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.307442 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qd697" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.323099 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-utilities\") pod \"cd76407f-83a3-4ef8-8f86-871ca466e436\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.323275 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzbmq\" (UniqueName: \"kubernetes.io/projected/cd76407f-83a3-4ef8-8f86-871ca466e436-kube-api-access-vzbmq\") pod \"cd76407f-83a3-4ef8-8f86-871ca466e436\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.323472 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-catalog-content\") pod \"cd76407f-83a3-4ef8-8f86-871ca466e436\" (UID: \"cd76407f-83a3-4ef8-8f86-871ca466e436\") " Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.330265 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-utilities" (OuterVolumeSpecName: "utilities") pod "cd76407f-83a3-4ef8-8f86-871ca466e436" (UID: "cd76407f-83a3-4ef8-8f86-871ca466e436"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.340633 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd76407f-83a3-4ef8-8f86-871ca466e436-kube-api-access-vzbmq" (OuterVolumeSpecName: "kube-api-access-vzbmq") pod "cd76407f-83a3-4ef8-8f86-871ca466e436" (UID: "cd76407f-83a3-4ef8-8f86-871ca466e436"). InnerVolumeSpecName "kube-api-access-vzbmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.359263 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvcck" event={"ID":"10e3e98a-e7b6-4433-851a-bc316b49143c","Type":"ContainerStarted","Data":"8a592d69905d3e31bb390b269f0c810a0b8771d856f083a595a49108627c1430"} Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.372503 4760 generic.go:334] "Generic (PLEG): container finished" podID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerID="d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c" exitCode=0 Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.372590 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd697" event={"ID":"cd76407f-83a3-4ef8-8f86-871ca466e436","Type":"ContainerDied","Data":"d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c"} Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.372850 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qd697" event={"ID":"cd76407f-83a3-4ef8-8f86-871ca466e436","Type":"ContainerDied","Data":"c0611821daf6026ee1261180748e68287432564f206a0c0d69decf38fb3447aa"} Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.372889 4760 scope.go:117] "RemoveContainer" containerID="d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.372619 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qd697" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.395627 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cvcck" podStartSLOduration=4.214403475 podStartE2EDuration="7.395605588s" podCreationTimestamp="2025-12-04 12:31:05 +0000 UTC" firstStartedPulling="2025-12-04 12:31:08.327280047 +0000 UTC m=+1071.368726614" lastFinishedPulling="2025-12-04 12:31:11.50848216 +0000 UTC m=+1074.549928727" observedRunningTime="2025-12-04 12:31:12.391839038 +0000 UTC m=+1075.433285625" watchObservedRunningTime="2025-12-04 12:31:12.395605588 +0000 UTC m=+1075.437052155" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.422866 4760 scope.go:117] "RemoveContainer" containerID="d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.425521 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd76407f-83a3-4ef8-8f86-871ca466e436" (UID: "cd76407f-83a3-4ef8-8f86-871ca466e436"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.426046 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.426072 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd76407f-83a3-4ef8-8f86-871ca466e436-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.426086 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzbmq\" (UniqueName: \"kubernetes.io/projected/cd76407f-83a3-4ef8-8f86-871ca466e436-kube-api-access-vzbmq\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.479891 4760 scope.go:117] "RemoveContainer" containerID="6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.484988 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.521946 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5pb2g" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.526400 4760 scope.go:117] "RemoveContainer" containerID="d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c" Dec 04 12:31:12 crc kubenswrapper[4760]: E1204 12:31:12.529333 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c\": container with ID starting with d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c not found: ID does not exist" 
containerID="d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.529378 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c"} err="failed to get container status \"d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c\": rpc error: code = NotFound desc = could not find container \"d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c\": container with ID starting with d574cd967e1d2642470025ac0fc5bb942d1ca8be77a927e05b1237c4d36a215c not found: ID does not exist" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.529409 4760 scope.go:117] "RemoveContainer" containerID="d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8" Dec 04 12:31:12 crc kubenswrapper[4760]: E1204 12:31:12.530108 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8\": container with ID starting with d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8 not found: ID does not exist" containerID="d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.530140 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8"} err="failed to get container status \"d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8\": rpc error: code = NotFound desc = could not find container \"d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8\": container with ID starting with d01822f0af2f58d3c9f11e30605a8483893234f005df07641f8eba345509baf8 not found: ID does not exist" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.530161 4760 scope.go:117] 
"RemoveContainer" containerID="6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef" Dec 04 12:31:12 crc kubenswrapper[4760]: E1204 12:31:12.531456 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef\": container with ID starting with 6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef not found: ID does not exist" containerID="6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.531490 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef"} err="failed to get container status \"6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef\": rpc error: code = NotFound desc = could not find container \"6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef\": container with ID starting with 6dbbd22b1dda1289a634abb14e457843a91dd0658890cf29d3301d226a35e7ef not found: ID does not exist" Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.726373 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qd697"] Dec 04 12:31:12 crc kubenswrapper[4760]: I1204 12:31:12.733177 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qd697"] Dec 04 12:31:13 crc kubenswrapper[4760]: I1204 12:31:13.873754 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd76407f-83a3-4ef8-8f86-871ca466e436" path="/var/lib/kubelet/pods/cd76407f-83a3-4ef8-8f86-871ca466e436/volumes" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.230799 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh"] Dec 04 12:31:14 crc kubenswrapper[4760]: 
E1204 12:31:14.231178 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerName="extract-utilities" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.231197 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerName="extract-utilities" Dec 04 12:31:14 crc kubenswrapper[4760]: E1204 12:31:14.231224 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerName="extract-content" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.231232 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerName="extract-content" Dec 04 12:31:14 crc kubenswrapper[4760]: E1204 12:31:14.231252 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerName="registry-server" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.231262 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerName="registry-server" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.231432 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd76407f-83a3-4ef8-8f86-871ca466e436" containerName="registry-server" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.232525 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.234848 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-74hlv" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.244539 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh"] Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.413763 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-util\") pod \"4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.414160 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grzsg\" (UniqueName: \"kubernetes.io/projected/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-kube-api-access-grzsg\") pod \"4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.414475 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-bundle\") pod \"4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 
12:31:14.515561 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grzsg\" (UniqueName: \"kubernetes.io/projected/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-kube-api-access-grzsg\") pod \"4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.515651 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-bundle\") pod \"4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.515699 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-util\") pod \"4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.516293 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-bundle\") pod \"4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.516343 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-util\") pod \"4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.538381 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grzsg\" (UniqueName: \"kubernetes.io/projected/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-kube-api-access-grzsg\") pod \"4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.550332 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:14 crc kubenswrapper[4760]: I1204 12:31:14.832644 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh"] Dec 04 12:31:14 crc kubenswrapper[4760]: W1204 12:31:14.842361 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ca5bd9_ed9a_4568_8b09_57e66e9ad187.slice/crio-6a90282e490b9913cc0419d75956385b78b2f7bde788620298327295e8fc6cf1 WatchSource:0}: Error finding container 6a90282e490b9913cc0419d75956385b78b2f7bde788620298327295e8fc6cf1: Status 404 returned error can't find the container with id 6a90282e490b9913cc0419d75956385b78b2f7bde788620298327295e8fc6cf1 Dec 04 12:31:15 crc kubenswrapper[4760]: I1204 12:31:15.383572 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdmwf"] Dec 04 12:31:15 crc kubenswrapper[4760]: I1204 12:31:15.396408 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" event={"ID":"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187","Type":"ContainerStarted","Data":"700931db30deb9fb5117907c4911c18937338cb33cece5e0cff14cf69bacff87"} Dec 04 12:31:15 crc kubenswrapper[4760]: I1204 12:31:15.396729 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" event={"ID":"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187","Type":"ContainerStarted","Data":"6a90282e490b9913cc0419d75956385b78b2f7bde788620298327295e8fc6cf1"} Dec 04 12:31:15 crc kubenswrapper[4760]: I1204 12:31:15.396633 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hdmwf" podUID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" containerName="registry-server" containerID="cri-o://be0ab6c872329b843e040e8eed3448d7eb4a700b2a4eb94950921605d6a3c731" gracePeriod=2 Dec 04 12:31:16 crc kubenswrapper[4760]: I1204 12:31:16.322601 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:16 crc kubenswrapper[4760]: I1204 12:31:16.322662 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:16 crc kubenswrapper[4760]: I1204 12:31:16.377392 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:16 crc kubenswrapper[4760]: I1204 12:31:16.407653 4760 generic.go:334] "Generic (PLEG): container finished" podID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" containerID="be0ab6c872329b843e040e8eed3448d7eb4a700b2a4eb94950921605d6a3c731" exitCode=0 Dec 04 12:31:16 crc kubenswrapper[4760]: I1204 12:31:16.407708 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-hdmwf" event={"ID":"7ef784e2-c0eb-4595-8f77-ebef054e2eb9","Type":"ContainerDied","Data":"be0ab6c872329b843e040e8eed3448d7eb4a700b2a4eb94950921605d6a3c731"} Dec 04 12:31:16 crc kubenswrapper[4760]: I1204 12:31:16.409870 4760 generic.go:334] "Generic (PLEG): container finished" podID="e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" containerID="700931db30deb9fb5117907c4911c18937338cb33cece5e0cff14cf69bacff87" exitCode=0 Dec 04 12:31:16 crc kubenswrapper[4760]: I1204 12:31:16.409984 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" event={"ID":"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187","Type":"ContainerDied","Data":"700931db30deb9fb5117907c4911c18937338cb33cece5e0cff14cf69bacff87"} Dec 04 12:31:16 crc kubenswrapper[4760]: I1204 12:31:16.468690 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:18 crc kubenswrapper[4760]: I1204 12:31:18.986467 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvcck"] Dec 04 12:31:18 crc kubenswrapper[4760]: I1204 12:31:18.987502 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cvcck" podUID="10e3e98a-e7b6-4433-851a-bc316b49143c" containerName="registry-server" containerID="cri-o://8a592d69905d3e31bb390b269f0c810a0b8771d856f083a595a49108627c1430" gracePeriod=2 Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.202077 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.394141 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-catalog-content\") pod \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.394291 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt5vp\" (UniqueName: \"kubernetes.io/projected/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-kube-api-access-jt5vp\") pod \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.394349 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-utilities\") pod \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\" (UID: \"7ef784e2-c0eb-4595-8f77-ebef054e2eb9\") " Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.395934 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-utilities" (OuterVolumeSpecName: "utilities") pod "7ef784e2-c0eb-4595-8f77-ebef054e2eb9" (UID: "7ef784e2-c0eb-4595-8f77-ebef054e2eb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.402662 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-kube-api-access-jt5vp" (OuterVolumeSpecName: "kube-api-access-jt5vp") pod "7ef784e2-c0eb-4595-8f77-ebef054e2eb9" (UID: "7ef784e2-c0eb-4595-8f77-ebef054e2eb9"). InnerVolumeSpecName "kube-api-access-jt5vp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.447885 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hdmwf" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.447976 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdmwf" event={"ID":"7ef784e2-c0eb-4595-8f77-ebef054e2eb9","Type":"ContainerDied","Data":"3d5e2c781ac60397458aa44e1f9c030b7f0b7e45733adc7e6bf3b1f46e20ec24"} Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.448071 4760 scope.go:117] "RemoveContainer" containerID="be0ab6c872329b843e040e8eed3448d7eb4a700b2a4eb94950921605d6a3c731" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.458358 4760 generic.go:334] "Generic (PLEG): container finished" podID="10e3e98a-e7b6-4433-851a-bc316b49143c" containerID="8a592d69905d3e31bb390b269f0c810a0b8771d856f083a595a49108627c1430" exitCode=0 Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.458456 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvcck" event={"ID":"10e3e98a-e7b6-4433-851a-bc316b49143c","Type":"ContainerDied","Data":"8a592d69905d3e31bb390b269f0c810a0b8771d856f083a595a49108627c1430"} Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.466334 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ef784e2-c0eb-4595-8f77-ebef054e2eb9" (UID: "7ef784e2-c0eb-4595-8f77-ebef054e2eb9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.496199 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.496699 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt5vp\" (UniqueName: \"kubernetes.io/projected/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-kube-api-access-jt5vp\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.496787 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef784e2-c0eb-4595-8f77-ebef054e2eb9-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.539989 4760 scope.go:117] "RemoveContainer" containerID="17e80fabf3efd7893290d66ea4a71f639fddd56157a03ea0f81efecb9e6d83f7" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.565470 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.598720 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-utilities\") pod \"10e3e98a-e7b6-4433-851a-bc316b49143c\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.599334 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-catalog-content\") pod \"10e3e98a-e7b6-4433-851a-bc316b49143c\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.599452 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l29df\" (UniqueName: \"kubernetes.io/projected/10e3e98a-e7b6-4433-851a-bc316b49143c-kube-api-access-l29df\") pod \"10e3e98a-e7b6-4433-851a-bc316b49143c\" (UID: \"10e3e98a-e7b6-4433-851a-bc316b49143c\") " Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.599714 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-utilities" (OuterVolumeSpecName: "utilities") pod "10e3e98a-e7b6-4433-851a-bc316b49143c" (UID: "10e3e98a-e7b6-4433-851a-bc316b49143c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.600299 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.604441 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e3e98a-e7b6-4433-851a-bc316b49143c-kube-api-access-l29df" (OuterVolumeSpecName: "kube-api-access-l29df") pod "10e3e98a-e7b6-4433-851a-bc316b49143c" (UID: "10e3e98a-e7b6-4433-851a-bc316b49143c"). InnerVolumeSpecName "kube-api-access-l29df". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.624627 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10e3e98a-e7b6-4433-851a-bc316b49143c" (UID: "10e3e98a-e7b6-4433-851a-bc316b49143c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.702435 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e3e98a-e7b6-4433-851a-bc316b49143c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.703008 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l29df\" (UniqueName: \"kubernetes.io/projected/10e3e98a-e7b6-4433-851a-bc316b49143c-kube-api-access-l29df\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.746264 4760 scope.go:117] "RemoveContainer" containerID="702eb2a4f4c4fecc9cbbf3e005ec8350bbb3c63d6c3830be71dff1b0895db27a" Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.778832 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdmwf"] Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.787791 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hdmwf"] Dec 04 12:31:19 crc kubenswrapper[4760]: I1204 12:31:19.881631 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" path="/var/lib/kubelet/pods/7ef784e2-c0eb-4595-8f77-ebef054e2eb9/volumes" Dec 04 12:31:20 crc kubenswrapper[4760]: I1204 12:31:20.469777 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvcck" event={"ID":"10e3e98a-e7b6-4433-851a-bc316b49143c","Type":"ContainerDied","Data":"fc0d25833f02d8b3d41cc34c36fddc17f5f2c3d8907f82181c0582255a9ade90"} Dec 04 12:31:20 crc kubenswrapper[4760]: I1204 12:31:20.469821 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvcck" Dec 04 12:31:20 crc kubenswrapper[4760]: I1204 12:31:20.469887 4760 scope.go:117] "RemoveContainer" containerID="8a592d69905d3e31bb390b269f0c810a0b8771d856f083a595a49108627c1430" Dec 04 12:31:20 crc kubenswrapper[4760]: I1204 12:31:20.473250 4760 generic.go:334] "Generic (PLEG): container finished" podID="e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" containerID="be8879529be865686390cb3087543fa314801e2359da87de5d8265ab9b066fdf" exitCode=0 Dec 04 12:31:20 crc kubenswrapper[4760]: I1204 12:31:20.473464 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" event={"ID":"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187","Type":"ContainerDied","Data":"be8879529be865686390cb3087543fa314801e2359da87de5d8265ab9b066fdf"} Dec 04 12:31:20 crc kubenswrapper[4760]: I1204 12:31:20.488454 4760 scope.go:117] "RemoveContainer" containerID="20e3c198235effd9996601247bc5d4d8e4d6495aa16a166977b50bbb75d56d70" Dec 04 12:31:20 crc kubenswrapper[4760]: I1204 12:31:20.533581 4760 scope.go:117] "RemoveContainer" containerID="a7669faabf9812dfd9e5bdc66c66b3a1e7afadbcc9045990334254e8d0f124d3" Dec 04 12:31:20 crc kubenswrapper[4760]: I1204 12:31:20.539806 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvcck"] Dec 04 12:31:20 crc kubenswrapper[4760]: I1204 12:31:20.546288 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvcck"] Dec 04 12:31:21 crc kubenswrapper[4760]: I1204 12:31:21.490717 4760 generic.go:334] "Generic (PLEG): container finished" podID="e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" containerID="f16df1a0164470da53f98d10ca3f0fe2ded3add2e8a00b93fdd7dadf7473045d" exitCode=0 Dec 04 12:31:21 crc kubenswrapper[4760]: I1204 12:31:21.490803 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" event={"ID":"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187","Type":"ContainerDied","Data":"f16df1a0164470da53f98d10ca3f0fe2ded3add2e8a00b93fdd7dadf7473045d"} Dec 04 12:31:21 crc kubenswrapper[4760]: I1204 12:31:21.875350 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e3e98a-e7b6-4433-851a-bc316b49143c" path="/var/lib/kubelet/pods/10e3e98a-e7b6-4433-851a-bc316b49143c/volumes" Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.489822 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-n4b8p" Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.857046 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.887113 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-bundle\") pod \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.887944 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-bundle" (OuterVolumeSpecName: "bundle") pod "e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" (UID: "e5ca5bd9-ed9a-4568-8b09-57e66e9ad187"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.887993 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-util\") pod \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.889805 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grzsg\" (UniqueName: \"kubernetes.io/projected/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-kube-api-access-grzsg\") pod \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\" (UID: \"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187\") " Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.892030 4760 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.901864 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-kube-api-access-grzsg" (OuterVolumeSpecName: "kube-api-access-grzsg") pod "e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" (UID: "e5ca5bd9-ed9a-4568-8b09-57e66e9ad187"). InnerVolumeSpecName "kube-api-access-grzsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.916485 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-util" (OuterVolumeSpecName: "util") pod "e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" (UID: "e5ca5bd9-ed9a-4568-8b09-57e66e9ad187"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.993868 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grzsg\" (UniqueName: \"kubernetes.io/projected/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-kube-api-access-grzsg\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:22 crc kubenswrapper[4760]: I1204 12:31:22.993919 4760 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e5ca5bd9-ed9a-4568-8b09-57e66e9ad187-util\") on node \"crc\" DevicePath \"\"" Dec 04 12:31:23 crc kubenswrapper[4760]: I1204 12:31:23.519073 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" event={"ID":"e5ca5bd9-ed9a-4568-8b09-57e66e9ad187","Type":"ContainerDied","Data":"6a90282e490b9913cc0419d75956385b78b2f7bde788620298327295e8fc6cf1"} Dec 04 12:31:23 crc kubenswrapper[4760]: I1204 12:31:23.519136 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a90282e490b9913cc0419d75956385b78b2f7bde788620298327295e8fc6cf1" Dec 04 12:31:23 crc kubenswrapper[4760]: I1204 12:31:23.519291 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.438124 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz"] Dec 04 12:31:25 crc kubenswrapper[4760]: E1204 12:31:25.439829 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" containerName="extract-content" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.439857 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" containerName="extract-content" Dec 04 12:31:25 crc kubenswrapper[4760]: E1204 12:31:25.439876 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3e98a-e7b6-4433-851a-bc316b49143c" containerName="registry-server" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.439885 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3e98a-e7b6-4433-851a-bc316b49143c" containerName="registry-server" Dec 04 12:31:25 crc kubenswrapper[4760]: E1204 12:31:25.439909 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" containerName="util" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.439918 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" containerName="util" Dec 04 12:31:25 crc kubenswrapper[4760]: E1204 12:31:25.439932 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3e98a-e7b6-4433-851a-bc316b49143c" containerName="extract-content" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.439940 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3e98a-e7b6-4433-851a-bc316b49143c" containerName="extract-content" Dec 04 12:31:25 crc kubenswrapper[4760]: E1204 12:31:25.439955 4760 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" containerName="extract" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.439962 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" containerName="extract" Dec 04 12:31:25 crc kubenswrapper[4760]: E1204 12:31:25.439977 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" containerName="registry-server" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.439986 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" containerName="registry-server" Dec 04 12:31:25 crc kubenswrapper[4760]: E1204 12:31:25.440001 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" containerName="pull" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.440008 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" containerName="pull" Dec 04 12:31:25 crc kubenswrapper[4760]: E1204 12:31:25.440020 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" containerName="extract-utilities" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.440029 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" containerName="extract-utilities" Dec 04 12:31:25 crc kubenswrapper[4760]: E1204 12:31:25.440042 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e3e98a-e7b6-4433-851a-bc316b49143c" containerName="extract-utilities" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.440052 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e3e98a-e7b6-4433-851a-bc316b49143c" containerName="extract-utilities" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.440280 4760 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7ef784e2-c0eb-4595-8f77-ebef054e2eb9" containerName="registry-server" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.440299 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ca5bd9-ed9a-4568-8b09-57e66e9ad187" containerName="extract" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.440318 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e3e98a-e7b6-4433-851a-bc316b49143c" containerName="registry-server" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.441232 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.447548 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-fwskr" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.472373 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz"] Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.549814 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8qx8\" (UniqueName: \"kubernetes.io/projected/91a1898b-cdb0-4f97-9bc0-242d1980bd8c-kube-api-access-m8qx8\") pod \"openstack-operator-controller-operator-645cc8bbd-8pfbz\" (UID: \"91a1898b-cdb0-4f97-9bc0-242d1980bd8c\") " pod="openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.652039 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8qx8\" (UniqueName: \"kubernetes.io/projected/91a1898b-cdb0-4f97-9bc0-242d1980bd8c-kube-api-access-m8qx8\") pod \"openstack-operator-controller-operator-645cc8bbd-8pfbz\" (UID: \"91a1898b-cdb0-4f97-9bc0-242d1980bd8c\") " 
pod="openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.673532 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8qx8\" (UniqueName: \"kubernetes.io/projected/91a1898b-cdb0-4f97-9bc0-242d1980bd8c-kube-api-access-m8qx8\") pod \"openstack-operator-controller-operator-645cc8bbd-8pfbz\" (UID: \"91a1898b-cdb0-4f97-9bc0-242d1980bd8c\") " pod="openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz" Dec 04 12:31:25 crc kubenswrapper[4760]: I1204 12:31:25.768441 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz" Dec 04 12:31:26 crc kubenswrapper[4760]: I1204 12:31:26.054027 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz"] Dec 04 12:31:26 crc kubenswrapper[4760]: W1204 12:31:26.069697 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91a1898b_cdb0_4f97_9bc0_242d1980bd8c.slice/crio-bf0254ef26f0f248f21ad20812ecc26dfc23c6b2ccdeb62c85c3e7be629eedb4 WatchSource:0}: Error finding container bf0254ef26f0f248f21ad20812ecc26dfc23c6b2ccdeb62c85c3e7be629eedb4: Status 404 returned error can't find the container with id bf0254ef26f0f248f21ad20812ecc26dfc23c6b2ccdeb62c85c3e7be629eedb4 Dec 04 12:31:26 crc kubenswrapper[4760]: I1204 12:31:26.549821 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz" event={"ID":"91a1898b-cdb0-4f97-9bc0-242d1980bd8c","Type":"ContainerStarted","Data":"bf0254ef26f0f248f21ad20812ecc26dfc23c6b2ccdeb62c85c3e7be629eedb4"} Dec 04 12:31:31 crc kubenswrapper[4760]: I1204 12:31:31.600355 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz" event={"ID":"91a1898b-cdb0-4f97-9bc0-242d1980bd8c","Type":"ContainerStarted","Data":"854c9def7ae035a7833144bef17ee66b513abccfac12a5c6f9be7716f821faa9"} Dec 04 12:31:31 crc kubenswrapper[4760]: I1204 12:31:31.601391 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz" Dec 04 12:31:31 crc kubenswrapper[4760]: I1204 12:31:31.638360 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz" podStartSLOduration=1.5192048580000002 podStartE2EDuration="6.638326173s" podCreationTimestamp="2025-12-04 12:31:25 +0000 UTC" firstStartedPulling="2025-12-04 12:31:26.071936268 +0000 UTC m=+1089.113382835" lastFinishedPulling="2025-12-04 12:31:31.191057573 +0000 UTC m=+1094.232504150" observedRunningTime="2025-12-04 12:31:31.633125468 +0000 UTC m=+1094.674572035" watchObservedRunningTime="2025-12-04 12:31:31.638326173 +0000 UTC m=+1094.679772750" Dec 04 12:31:45 crc kubenswrapper[4760]: I1204 12:31:45.771753 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-645cc8bbd-8pfbz" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.560736 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.563965 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.567898 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.570052 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-d49fj" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.578843 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.580349 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhgk\" (UniqueName: \"kubernetes.io/projected/2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd-kube-api-access-7lhgk\") pod \"cinder-operator-controller-manager-859b6ccc6-qgdz6\" (UID: \"2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.580488 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjdkw\" (UniqueName: \"kubernetes.io/projected/69f6297b-8cdc-4bfc-ba61-1868e7805998-kube-api-access-zjdkw\") pod \"barbican-operator-controller-manager-7d9dfd778-kzkts\" (UID: \"69f6297b-8cdc-4bfc-ba61-1868e7805998\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.583928 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zjwtc" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.593762 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.607335 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.623614 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.624856 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.634453 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-h4rrv" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.654054 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.657867 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.663994 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-qvlmh" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.672003 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.678667 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.685481 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqrwf\" (UniqueName: \"kubernetes.io/projected/9093443e-058e-41f0-81ea-9ff8ba566d8a-kube-api-access-tqrwf\") pod \"designate-operator-controller-manager-78b4bc895b-42kkv\" (UID: \"9093443e-058e-41f0-81ea-9ff8ba566d8a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.685835 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhgk\" (UniqueName: \"kubernetes.io/projected/2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd-kube-api-access-7lhgk\") pod \"cinder-operator-controller-manager-859b6ccc6-qgdz6\" (UID: \"2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.685978 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr8kb\" (UniqueName: \"kubernetes.io/projected/e8a9a9f4-8e40-4506-9aeb-c3e83d62de39-kube-api-access-jr8kb\") pod 
\"glance-operator-controller-manager-77987cd8cd-8c7dd\" (UID: \"e8a9a9f4-8e40-4506-9aeb-c3e83d62de39\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.686129 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjdkw\" (UniqueName: \"kubernetes.io/projected/69f6297b-8cdc-4bfc-ba61-1868e7805998-kube-api-access-zjdkw\") pod \"barbican-operator-controller-manager-7d9dfd778-kzkts\" (UID: \"69f6297b-8cdc-4bfc-ba61-1868e7805998\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.745942 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.769171 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjdkw\" (UniqueName: \"kubernetes.io/projected/69f6297b-8cdc-4bfc-ba61-1868e7805998-kube-api-access-zjdkw\") pod \"barbican-operator-controller-manager-7d9dfd778-kzkts\" (UID: \"69f6297b-8cdc-4bfc-ba61-1868e7805998\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.770160 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lhgk\" (UniqueName: \"kubernetes.io/projected/2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd-kube-api-access-7lhgk\") pod \"cinder-operator-controller-manager-859b6ccc6-qgdz6\" (UID: \"2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.879540 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.882742 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fgpb\" (UniqueName: \"kubernetes.io/projected/abc24b8c-0be3-44c7-b011-5ea10803fdf1-kube-api-access-9fgpb\") pod \"heat-operator-controller-manager-5f64f6f8bb-hfqvw\" (UID: \"abc24b8c-0be3-44c7-b011-5ea10803fdf1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.882903 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqrwf\" (UniqueName: \"kubernetes.io/projected/9093443e-058e-41f0-81ea-9ff8ba566d8a-kube-api-access-tqrwf\") pod \"designate-operator-controller-manager-78b4bc895b-42kkv\" (UID: \"9093443e-058e-41f0-81ea-9ff8ba566d8a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.882987 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr8kb\" (UniqueName: \"kubernetes.io/projected/e8a9a9f4-8e40-4506-9aeb-c3e83d62de39-kube-api-access-jr8kb\") pod \"glance-operator-controller-manager-77987cd8cd-8c7dd\" (UID: \"e8a9a9f4-8e40-4506-9aeb-c3e83d62de39\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.888978 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-gpfhb" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.889615 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.894311 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.896304 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.909756 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xgk5f" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.914432 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.915942 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.919800 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2dbkw" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.920459 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.927241 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.939893 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr8kb\" (UniqueName: \"kubernetes.io/projected/e8a9a9f4-8e40-4506-9aeb-c3e83d62de39-kube-api-access-jr8kb\") pod \"glance-operator-controller-manager-77987cd8cd-8c7dd\" (UID: \"e8a9a9f4-8e40-4506-9aeb-c3e83d62de39\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.946643 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.948590 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.958580 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.960049 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.966600 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-xls9v" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.973372 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqrwf\" (UniqueName: \"kubernetes.io/projected/9093443e-058e-41f0-81ea-9ff8ba566d8a-kube-api-access-tqrwf\") pod \"designate-operator-controller-manager-78b4bc895b-42kkv\" (UID: \"9093443e-058e-41f0-81ea-9ff8ba566d8a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.977787 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.984417 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fgpb\" (UniqueName: \"kubernetes.io/projected/abc24b8c-0be3-44c7-b011-5ea10803fdf1-kube-api-access-9fgpb\") pod \"heat-operator-controller-manager-5f64f6f8bb-hfqvw\" (UID: \"abc24b8c-0be3-44c7-b011-5ea10803fdf1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.987193 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd"] Dec 04 12:32:05 crc kubenswrapper[4760]: I1204 12:32:05.996557 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.012924 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.012982 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.013969 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.014571 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.014886 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fgpb\" (UniqueName: \"kubernetes.io/projected/abc24b8c-0be3-44c7-b011-5ea10803fdf1-kube-api-access-9fgpb\") pod \"heat-operator-controller-manager-5f64f6f8bb-hfqvw\" (UID: \"abc24b8c-0be3-44c7-b011-5ea10803fdf1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.016493 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-bgcvd" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.016799 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dvhns" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.023861 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.025991 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.029937 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-m25dv" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.079854 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.087133 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fklt6\" (UniqueName: \"kubernetes.io/projected/73caa66c-d120-4b70-b417-d7f363ce6236-kube-api-access-fklt6\") pod \"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.087249 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dlss\" (UniqueName: \"kubernetes.io/projected/4018757f-a398-4734-9a4e-b6cc11327b9f-kube-api-access-7dlss\") pod \"horizon-operator-controller-manager-68c6d99b8f-5jtgd\" (UID: \"4018757f-a398-4734-9a4e-b6cc11327b9f\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.087288 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert\") pod \"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.087379 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvtdd\" (UniqueName: \"kubernetes.io/projected/ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a-kube-api-access-xvtdd\") pod \"ironic-operator-controller-manager-6c548fd776-thpz6\" (UID: \"ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.113067 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.163075 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.192818 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r8qv\" (UniqueName: \"kubernetes.io/projected/b7918a8b-7f47-4d71-820b-95156b273357-kube-api-access-4r8qv\") pod \"manila-operator-controller-manager-7c79b5df47-g64p8\" (UID: \"b7918a8b-7f47-4d71-820b-95156b273357\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.192903 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dlss\" (UniqueName: \"kubernetes.io/projected/4018757f-a398-4734-9a4e-b6cc11327b9f-kube-api-access-7dlss\") pod \"horizon-operator-controller-manager-68c6d99b8f-5jtgd\" (UID: \"4018757f-a398-4734-9a4e-b6cc11327b9f\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.192958 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert\") pod 
\"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.193009 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc8w9\" (UniqueName: \"kubernetes.io/projected/546bf075-78c1-4324-ba9a-80ac8df0c4f7-kube-api-access-zc8w9\") pod \"mariadb-operator-controller-manager-56bbcc9d85-m9phl\" (UID: \"546bf075-78c1-4324-ba9a-80ac8df0c4f7\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.193073 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvtdd\" (UniqueName: \"kubernetes.io/projected/ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a-kube-api-access-xvtdd\") pod \"ironic-operator-controller-manager-6c548fd776-thpz6\" (UID: \"ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.193123 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fklt6\" (UniqueName: \"kubernetes.io/projected/73caa66c-d120-4b70-b417-d7f363ce6236-kube-api-access-fklt6\") pod \"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.193145 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7pjq\" (UniqueName: \"kubernetes.io/projected/35ad36be-f7a9-4ca8-bd29-0d5ccd658c53-kube-api-access-m7pjq\") pod \"keystone-operator-controller-manager-7765d96ddf-4866m\" (UID: \"35ad36be-f7a9-4ca8-bd29-0d5ccd658c53\") " 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" Dec 04 12:32:06 crc kubenswrapper[4760]: E1204 12:32:06.193669 4760 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 12:32:06 crc kubenswrapper[4760]: E1204 12:32:06.193729 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert podName:73caa66c-d120-4b70-b417-d7f363ce6236 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:06.693705011 +0000 UTC m=+1129.735151578 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert") pod "infra-operator-controller-manager-57548d458d-2ffsj" (UID: "73caa66c-d120-4b70-b417-d7f363ce6236") : secret "infra-operator-webhook-server-cert" not found Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.214289 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.237740 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.238124 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dlss\" (UniqueName: \"kubernetes.io/projected/4018757f-a398-4734-9a4e-b6cc11327b9f-kube-api-access-7dlss\") pod \"horizon-operator-controller-manager-68c6d99b8f-5jtgd\" (UID: \"4018757f-a398-4734-9a4e-b6cc11327b9f\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.238578 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.239995 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.257812 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvtdd\" (UniqueName: \"kubernetes.io/projected/ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a-kube-api-access-xvtdd\") pod \"ironic-operator-controller-manager-6c548fd776-thpz6\" (UID: \"ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.261277 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-5xgnz" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.279039 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fklt6\" (UniqueName: \"kubernetes.io/projected/73caa66c-d120-4b70-b417-d7f363ce6236-kube-api-access-fklt6\") pod \"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.295417 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7pjq\" (UniqueName: \"kubernetes.io/projected/35ad36be-f7a9-4ca8-bd29-0d5ccd658c53-kube-api-access-m7pjq\") pod \"keystone-operator-controller-manager-7765d96ddf-4866m\" (UID: \"35ad36be-f7a9-4ca8-bd29-0d5ccd658c53\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.295941 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r8qv\" (UniqueName: \"kubernetes.io/projected/b7918a8b-7f47-4d71-820b-95156b273357-kube-api-access-4r8qv\") pod \"manila-operator-controller-manager-7c79b5df47-g64p8\" (UID: \"b7918a8b-7f47-4d71-820b-95156b273357\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.296288 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc8w9\" (UniqueName: \"kubernetes.io/projected/546bf075-78c1-4324-ba9a-80ac8df0c4f7-kube-api-access-zc8w9\") pod \"mariadb-operator-controller-manager-56bbcc9d85-m9phl\" (UID: \"546bf075-78c1-4324-ba9a-80ac8df0c4f7\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.296466 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lwxg\" (UniqueName: \"kubernetes.io/projected/282a54d8-5318-49e0-aefe-a86a7a8d63ac-kube-api-access-7lwxg\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-x5zgc\" (UID: \"282a54d8-5318-49e0-aefe-a86a7a8d63ac\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 
12:32:06.321854 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r8qv\" (UniqueName: \"kubernetes.io/projected/b7918a8b-7f47-4d71-820b-95156b273357-kube-api-access-4r8qv\") pod \"manila-operator-controller-manager-7c79b5df47-g64p8\" (UID: \"b7918a8b-7f47-4d71-820b-95156b273357\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.322800 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7pjq\" (UniqueName: \"kubernetes.io/projected/35ad36be-f7a9-4ca8-bd29-0d5ccd658c53-kube-api-access-m7pjq\") pod \"keystone-operator-controller-manager-7765d96ddf-4866m\" (UID: \"35ad36be-f7a9-4ca8-bd29-0d5ccd658c53\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.330493 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.348700 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.354823 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc8w9\" (UniqueName: \"kubernetes.io/projected/546bf075-78c1-4324-ba9a-80ac8df0c4f7-kube-api-access-zc8w9\") pod \"mariadb-operator-controller-manager-56bbcc9d85-m9phl\" (UID: \"546bf075-78c1-4324-ba9a-80ac8df0c4f7\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.364138 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-slr7f"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.365933 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.376676 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.378546 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.387696 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.393368 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-slr7f"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.398779 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.400661 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.401162 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.402350 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jd7\" (UniqueName: \"kubernetes.io/projected/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-kube-api-access-q6jd7\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: \"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.402538 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: \"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.402625 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7lwxg\" (UniqueName: \"kubernetes.io/projected/282a54d8-5318-49e0-aefe-a86a7a8d63ac-kube-api-access-7lwxg\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-x5zgc\" (UID: \"282a54d8-5318-49e0-aefe-a86a7a8d63ac\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.402691 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t7jg\" (UniqueName: \"kubernetes.io/projected/f13ef420-321e-40f8-90d2-e6fdcbb72752-kube-api-access-6t7jg\") pod \"octavia-operator-controller-manager-998648c74-slr7f\" (UID: \"f13ef420-321e-40f8-90d2-e6fdcbb72752\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.404928 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.406927 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.413976 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.420578 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.432022 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.462506 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.505590 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: \"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.506122 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t7jg\" (UniqueName: \"kubernetes.io/projected/f13ef420-321e-40f8-90d2-e6fdcbb72752-kube-api-access-6t7jg\") pod \"octavia-operator-controller-manager-998648c74-slr7f\" (UID: \"f13ef420-321e-40f8-90d2-e6fdcbb72752\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.506247 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jd7\" (UniqueName: \"kubernetes.io/projected/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-kube-api-access-q6jd7\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: \"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.510885 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-pjwvl" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.530271 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.533974 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-skf7n" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.534201 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.537030 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-49lhb" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.547481 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 04 12:32:06 crc kubenswrapper[4760]: E1204 12:32:06.565082 4760 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 12:32:06 crc kubenswrapper[4760]: E1204 12:32:06.578083 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert podName:ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:07.078013364 +0000 UTC m=+1130.119459951 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" (UID: "ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.611462 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-8w52m" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.614764 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.662689 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2kld\" (UniqueName: \"kubernetes.io/projected/a01a242a-291b-4281-a331-91c05efcdf87-kube-api-access-g2kld\") pod \"nova-operator-controller-manager-697bc559fc-57pzb\" (UID: \"a01a242a-291b-4281-a331-91c05efcdf87\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.691437 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lwxg\" (UniqueName: \"kubernetes.io/projected/282a54d8-5318-49e0-aefe-a86a7a8d63ac-kube-api-access-7lwxg\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-x5zgc\" (UID: \"282a54d8-5318-49e0-aefe-a86a7a8d63ac\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.680357 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jmpz\" (UniqueName: \"kubernetes.io/projected/21a55b1d-ebff-4abd-a556-d272a1753a5b-kube-api-access-7jmpz\") pod \"placement-operator-controller-manager-78f8948974-ss2gd\" (UID: 
\"21a55b1d-ebff-4abd-a556-d272a1753a5b\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.709242 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6d9x\" (UniqueName: \"kubernetes.io/projected/97874450-ee94-4963-aa10-a58295edae62-kube-api-access-r6d9x\") pod \"ovn-operator-controller-manager-b6456fdb6-zxmkv\" (UID: \"97874450-ee94-4963-aa10-a58295edae62\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.803939 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6wf7x" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.857299 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t7jg\" (UniqueName: \"kubernetes.io/projected/f13ef420-321e-40f8-90d2-e6fdcbb72752-kube-api-access-6t7jg\") pod \"octavia-operator-controller-manager-998648c74-slr7f\" (UID: \"f13ef420-321e-40f8-90d2-e6fdcbb72752\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.860289 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jd7\" (UniqueName: \"kubernetes.io/projected/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-kube-api-access-q6jd7\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: \"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.939717 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 
12:32:06.940553 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.945626 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6d9x\" (UniqueName: \"kubernetes.io/projected/97874450-ee94-4963-aa10-a58295edae62-kube-api-access-r6d9x\") pod \"ovn-operator-controller-manager-b6456fdb6-zxmkv\" (UID: \"97874450-ee94-4963-aa10-a58295edae62\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.945798 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2kld\" (UniqueName: \"kubernetes.io/projected/a01a242a-291b-4281-a331-91c05efcdf87-kube-api-access-g2kld\") pod \"nova-operator-controller-manager-697bc559fc-57pzb\" (UID: \"a01a242a-291b-4281-a331-91c05efcdf87\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.945941 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert\") pod \"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.946050 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jmpz\" (UniqueName: \"kubernetes.io/projected/21a55b1d-ebff-4abd-a556-d272a1753a5b-kube-api-access-7jmpz\") pod \"placement-operator-controller-manager-78f8948974-ss2gd\" (UID: \"21a55b1d-ebff-4abd-a556-d272a1753a5b\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" Dec 04 12:32:06 crc 
kubenswrapper[4760]: E1204 12:32:06.947140 4760 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 12:32:06 crc kubenswrapper[4760]: E1204 12:32:06.947188 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert podName:73caa66c-d120-4b70-b417-d7f363ce6236 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:07.947170914 +0000 UTC m=+1130.988617481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert") pod "infra-operator-controller-manager-57548d458d-2ffsj" (UID: "73caa66c-d120-4b70-b417-d7f363ce6236") : secret "infra-operator-webhook-server-cert" not found Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.951790 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.955461 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.963762 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.963871 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.967316 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.968918 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4k89n" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.979483 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-nbm7q" Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.979845 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.990404 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-g22dq"] Dec 04 12:32:06 crc kubenswrapper[4760]: I1204 12:32:06.994902 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.005144 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8cjxq" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.009744 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.014923 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps"] Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.018735 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.041683 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-rzhvs" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.050160 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvq4j\" (UniqueName: \"kubernetes.io/projected/fcf368e1-183d-445d-b3b7-dfd4f08fddcd-kube-api-access-kvq4j\") pod \"swift-operator-controller-manager-5f8c65bbfc-q9nzq\" (UID: \"fcf368e1-183d-445d-b3b7-dfd4f08fddcd\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.050323 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2tmx\" (UniqueName: \"kubernetes.io/projected/7c8092df-4a88-4a8c-a400-6435f525a5ec-kube-api-access-b2tmx\") pod \"watcher-operator-controller-manager-769dc69bc-wwqps\" (UID: \"7c8092df-4a88-4a8c-a400-6435f525a5ec\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.050703 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdzgx\" (UniqueName: \"kubernetes.io/projected/750e95e1-6019-4779-a2c0-4abcce4b1c8c-kube-api-access-wdzgx\") pod 
\"test-operator-controller-manager-5854674fcc-g22dq\" (UID: \"750e95e1-6019-4779-a2c0-4abcce4b1c8c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.050803 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrsv4\" (UniqueName: \"kubernetes.io/projected/a96cd5b4-6668-4815-b121-777fe0e65833-kube-api-access-zrsv4\") pod \"telemetry-operator-controller-manager-76cc84c6bb-twx55\" (UID: \"a96cd5b4-6668-4815-b121-777fe0e65833\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.076909 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-g22dq"] Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.138999 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps"] Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.152123 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvq4j\" (UniqueName: \"kubernetes.io/projected/fcf368e1-183d-445d-b3b7-dfd4f08fddcd-kube-api-access-kvq4j\") pod \"swift-operator-controller-manager-5f8c65bbfc-q9nzq\" (UID: \"fcf368e1-183d-445d-b3b7-dfd4f08fddcd\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.152282 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: \"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:07 crc kubenswrapper[4760]: 
I1204 12:32:07.152314 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2tmx\" (UniqueName: \"kubernetes.io/projected/7c8092df-4a88-4a8c-a400-6435f525a5ec-kube-api-access-b2tmx\") pod \"watcher-operator-controller-manager-769dc69bc-wwqps\" (UID: \"7c8092df-4a88-4a8c-a400-6435f525a5ec\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.152410 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdzgx\" (UniqueName: \"kubernetes.io/projected/750e95e1-6019-4779-a2c0-4abcce4b1c8c-kube-api-access-wdzgx\") pod \"test-operator-controller-manager-5854674fcc-g22dq\" (UID: \"750e95e1-6019-4779-a2c0-4abcce4b1c8c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.152440 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrsv4\" (UniqueName: \"kubernetes.io/projected/a96cd5b4-6668-4815-b121-777fe0e65833-kube-api-access-zrsv4\") pod \"telemetry-operator-controller-manager-76cc84c6bb-twx55\" (UID: \"a96cd5b4-6668-4815-b121-777fe0e65833\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.152870 4760 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.152921 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert podName:ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:08.152906136 +0000 UTC m=+1131.194352703 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" (UID: "ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.168724 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6d9x\" (UniqueName: \"kubernetes.io/projected/97874450-ee94-4963-aa10-a58295edae62-kube-api-access-r6d9x\") pod \"ovn-operator-controller-manager-b6456fdb6-zxmkv\" (UID: \"97874450-ee94-4963-aa10-a58295edae62\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.169815 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v"] Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.170806 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.186380 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2kld\" (UniqueName: \"kubernetes.io/projected/a01a242a-291b-4281-a331-91c05efcdf87-kube-api-access-g2kld\") pod \"nova-operator-controller-manager-697bc559fc-57pzb\" (UID: \"a01a242a-291b-4281-a331-91c05efcdf87\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.187298 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v"] Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.190330 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jmpz\" (UniqueName: \"kubernetes.io/projected/21a55b1d-ebff-4abd-a556-d272a1753a5b-kube-api-access-7jmpz\") pod \"placement-operator-controller-manager-78f8948974-ss2gd\" (UID: \"21a55b1d-ebff-4abd-a556-d272a1753a5b\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.196390 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.196409 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.196882 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-v8dwg" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.208683 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2tmx\" (UniqueName: 
\"kubernetes.io/projected/7c8092df-4a88-4a8c-a400-6435f525a5ec-kube-api-access-b2tmx\") pod \"watcher-operator-controller-manager-769dc69bc-wwqps\" (UID: \"7c8092df-4a88-4a8c-a400-6435f525a5ec\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.212425 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.228036 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdzgx\" (UniqueName: \"kubernetes.io/projected/750e95e1-6019-4779-a2c0-4abcce4b1c8c-kube-api-access-wdzgx\") pod \"test-operator-controller-manager-5854674fcc-g22dq\" (UID: \"750e95e1-6019-4779-a2c0-4abcce4b1c8c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.243317 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvq4j\" (UniqueName: \"kubernetes.io/projected/fcf368e1-183d-445d-b3b7-dfd4f08fddcd-kube-api-access-kvq4j\") pod \"swift-operator-controller-manager-5f8c65bbfc-q9nzq\" (UID: \"fcf368e1-183d-445d-b3b7-dfd4f08fddcd\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.254546 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrsv4\" (UniqueName: \"kubernetes.io/projected/a96cd5b4-6668-4815-b121-777fe0e65833-kube-api-access-zrsv4\") pod \"telemetry-operator-controller-manager-76cc84c6bb-twx55\" (UID: \"a96cd5b4-6668-4815-b121-777fe0e65833\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.304893 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.321191 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.336945 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9"] Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.349686 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.377298 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.377507 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vq4c\" (UniqueName: \"kubernetes.io/projected/bfa893f7-8101-4fd1-ae93-94688b827e95-kube-api-access-7vq4c\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.377637 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs\") pod 
\"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.392024 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-kj6tm" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.394589 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.411237 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.436646 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.451022 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9"] Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.482886 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.483923 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.484017 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vq4c\" (UniqueName: \"kubernetes.io/projected/bfa893f7-8101-4fd1-ae93-94688b827e95-kube-api-access-7vq4c\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.484054 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.484103 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl22k\" (UniqueName: \"kubernetes.io/projected/35d340c4-abab-4dc8-8ba4-e8740d6b89d4-kube-api-access-xl22k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pptq9\" (UID: \"35d340c4-abab-4dc8-8ba4-e8740d6b89d4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.484656 
4760 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.484701 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs podName:bfa893f7-8101-4fd1-ae93-94688b827e95 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:07.984684961 +0000 UTC m=+1131.026131528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs") pod "openstack-operator-controller-manager-77fb648ff9-cnn8v" (UID: "bfa893f7-8101-4fd1-ae93-94688b827e95") : secret "webhook-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.484859 4760 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.484915 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs podName:bfa893f7-8101-4fd1-ae93-94688b827e95 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:07.984893537 +0000 UTC m=+1131.026340104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs") pod "openstack-operator-controller-manager-77fb648ff9-cnn8v" (UID: "bfa893f7-8101-4fd1-ae93-94688b827e95") : secret "metrics-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.531539 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vq4c\" (UniqueName: \"kubernetes.io/projected/bfa893f7-8101-4fd1-ae93-94688b827e95-kube-api-access-7vq4c\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.585609 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl22k\" (UniqueName: \"kubernetes.io/projected/35d340c4-abab-4dc8-8ba4-e8740d6b89d4-kube-api-access-xl22k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pptq9\" (UID: \"35d340c4-abab-4dc8-8ba4-e8740d6b89d4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.663922 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl22k\" (UniqueName: \"kubernetes.io/projected/35d340c4-abab-4dc8-8ba4-e8740d6b89d4-kube-api-access-xl22k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pptq9\" (UID: \"35d340c4-abab-4dc8-8ba4-e8740d6b89d4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.790866 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.995838 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.995928 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert\") pod \"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:07 crc kubenswrapper[4760]: I1204 12:32:07.996051 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.996366 4760 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.996450 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs podName:bfa893f7-8101-4fd1-ae93-94688b827e95 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:08.996423708 +0000 UTC m=+1132.037870275 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs") pod "openstack-operator-controller-manager-77fb648ff9-cnn8v" (UID: "bfa893f7-8101-4fd1-ae93-94688b827e95") : secret "metrics-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.996458 4760 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.996665 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs podName:bfa893f7-8101-4fd1-ae93-94688b827e95 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:08.996626905 +0000 UTC m=+1132.038073472 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs") pod "openstack-operator-controller-manager-77fb648ff9-cnn8v" (UID: "bfa893f7-8101-4fd1-ae93-94688b827e95") : secret "webhook-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.996789 4760 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 12:32:07 crc kubenswrapper[4760]: E1204 12:32:07.996826 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert podName:73caa66c-d120-4b70-b417-d7f363ce6236 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:09.996814011 +0000 UTC m=+1133.038260578 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert") pod "infra-operator-controller-manager-57548d458d-2ffsj" (UID: "73caa66c-d120-4b70-b417-d7f363ce6236") : secret "infra-operator-webhook-server-cert" not found Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.186874 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv"] Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.202632 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: \"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:08 crc kubenswrapper[4760]: E1204 12:32:08.203055 4760 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 12:32:08 crc kubenswrapper[4760]: E1204 12:32:08.203274 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert podName:ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:10.203258205 +0000 UTC m=+1133.244704772 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" (UID: "ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.294752 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6"] Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.302056 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd"] Dec 04 12:32:08 crc kubenswrapper[4760]: W1204 12:32:08.339490 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9093443e_058e_41f0_81ea_9ff8ba566d8a.slice/crio-9c4fdd5af334967bd2b7712e02bd6757d0df865ff41ab9e87d533c112ed1036c WatchSource:0}: Error finding container 9c4fdd5af334967bd2b7712e02bd6757d0df865ff41ab9e87d533c112ed1036c: Status 404 returned error can't find the container with id 9c4fdd5af334967bd2b7712e02bd6757d0df865ff41ab9e87d533c112ed1036c Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.354302 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.371134 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m"] Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.390079 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd"] Dec 04 12:32:08 crc kubenswrapper[4760]: W1204 12:32:08.416836 4760 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4018757f_a398_4734_9a4e_b6cc11327b9f.slice/crio-5b1c49fef170747010d78f9f702aab7d782bc1ae408545e07746ea4d42bada29 WatchSource:0}: Error finding container 5b1c49fef170747010d78f9f702aab7d782bc1ae408545e07746ea4d42bada29: Status 404 returned error can't find the container with id 5b1c49fef170747010d78f9f702aab7d782bc1ae408545e07746ea4d42bada29 Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.784102 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts"] Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.843032 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6"] Dec 04 12:32:08 crc kubenswrapper[4760]: W1204 12:32:08.845055 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca3c9d7d_086a_4b0a_bf4a_f5381c283f0a.slice/crio-4903fd3a05028dccd8d0be7d1bbb4480b128770d81a00a16fee0b4e39bc32a62 WatchSource:0}: Error finding container 4903fd3a05028dccd8d0be7d1bbb4480b128770d81a00a16fee0b4e39bc32a62: Status 404 returned error can't find the container with id 4903fd3a05028dccd8d0be7d1bbb4480b128770d81a00a16fee0b4e39bc32a62 Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.870566 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8"] Dec 04 12:32:08 crc kubenswrapper[4760]: I1204 12:32:08.984226 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw"] Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.025492 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs\") pod 
\"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.025607 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.025668 4760 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.025750 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs podName:bfa893f7-8101-4fd1-ae93-94688b827e95 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:11.025728699 +0000 UTC m=+1134.067175266 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs") pod "openstack-operator-controller-manager-77fb648ff9-cnn8v" (UID: "bfa893f7-8101-4fd1-ae93-94688b827e95") : secret "metrics-server-cert" not found Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.025776 4760 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.025838 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs podName:bfa893f7-8101-4fd1-ae93-94688b827e95 nodeName:}" failed. 
No retries permitted until 2025-12-04 12:32:11.025819532 +0000 UTC m=+1134.067266289 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs") pod "openstack-operator-controller-manager-77fb648ff9-cnn8v" (UID: "bfa893f7-8101-4fd1-ae93-94688b827e95") : secret "webhook-server-cert" not found Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.040829 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" event={"ID":"35ad36be-f7a9-4ca8-bd29-0d5ccd658c53","Type":"ContainerStarted","Data":"6784645e6984be5e0de389c8c3d6e3d86b2089d47c332d7c0635e45de7b85451"} Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.043461 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" event={"ID":"2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd","Type":"ContainerStarted","Data":"0329011c65b1163a9ef9723b67ae54bd8bce590c37cee393a034ee54be28904c"} Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.045162 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" event={"ID":"69f6297b-8cdc-4bfc-ba61-1868e7805998","Type":"ContainerStarted","Data":"0548b1d1c868386dfd5a5af9366c5a430867ffc4f006e8a399cff7fe7594f065"} Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.046928 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" event={"ID":"ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a","Type":"ContainerStarted","Data":"4903fd3a05028dccd8d0be7d1bbb4480b128770d81a00a16fee0b4e39bc32a62"} Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.048549 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" 
event={"ID":"abc24b8c-0be3-44c7-b011-5ea10803fdf1","Type":"ContainerStarted","Data":"1e57e52f0237b01735f621e62b6628bed1864cd783b9fe424efffec2f93ef617"} Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.050798 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" event={"ID":"b7918a8b-7f47-4d71-820b-95156b273357","Type":"ContainerStarted","Data":"6f6ebc5ed4a4c579372022ea3f0c08ec5d98dc100d586de4da18b7826bb012cb"} Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.052063 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" event={"ID":"9093443e-058e-41f0-81ea-9ff8ba566d8a","Type":"ContainerStarted","Data":"9c4fdd5af334967bd2b7712e02bd6757d0df865ff41ab9e87d533c112ed1036c"} Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.054059 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" event={"ID":"e8a9a9f4-8e40-4506-9aeb-c3e83d62de39","Type":"ContainerStarted","Data":"08ee7bf9fe9dd4b7ceb824b0ea64b9cec083355037b519cb7652d39b565e6db7"} Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.055389 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" event={"ID":"4018757f-a398-4734-9a4e-b6cc11327b9f","Type":"ContainerStarted","Data":"5b1c49fef170747010d78f9f702aab7d782bc1ae408545e07746ea4d42bada29"} Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.093894 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl"] Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.108893 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-slr7f"] Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 
12:32:09.191579 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv"] Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.202143 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd"] Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.231066 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc"] Dec 04 12:32:09 crc kubenswrapper[4760]: W1204 12:32:09.231901 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97874450_ee94_4963_aa10_a58295edae62.slice/crio-c16df851ad91cab659ecdb4eb3e5b6c86d8043feabde3a180cc0537c0c2d8e99 WatchSource:0}: Error finding container c16df851ad91cab659ecdb4eb3e5b6c86d8043feabde3a180cc0537c0c2d8e99: Status 404 returned error can't find the container with id c16df851ad91cab659ecdb4eb3e5b6c86d8043feabde3a180cc0537c0c2d8e99 Dec 04 12:32:09 crc kubenswrapper[4760]: W1204 12:32:09.232481 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21a55b1d_ebff_4abd_a556_d272a1753a5b.slice/crio-1199ba8e4324aba29402dadd240493b1414d2b32d132a6949d92f0ac5072cfb6 WatchSource:0}: Error finding container 1199ba8e4324aba29402dadd240493b1414d2b32d132a6949d92f0ac5072cfb6: Status 404 returned error can't find the container with id 1199ba8e4324aba29402dadd240493b1414d2b32d132a6949d92f0ac5072cfb6 Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.243594 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps"] Dec 04 12:32:09 crc kubenswrapper[4760]: W1204 12:32:09.255161 4760 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod282a54d8_5318_49e0_aefe_a86a7a8d63ac.slice/crio-30f6dce17499b095c27a6890cfbe8f3fc679a39d2b67fe1222c3f87d99dfaf60 WatchSource:0}: Error finding container 30f6dce17499b095c27a6890cfbe8f3fc679a39d2b67fe1222c3f87d99dfaf60: Status 404 returned error can't find the container with id 30f6dce17499b095c27a6890cfbe8f3fc679a39d2b67fe1222c3f87d99dfaf60 Dec 04 12:32:09 crc kubenswrapper[4760]: W1204 12:32:09.257020 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c8092df_4a88_4a8c_a400_6435f525a5ec.slice/crio-21006c4a06ff5c9d5c0b77e2b95b5568fb9007280e4771fec2e4823b23a02b03 WatchSource:0}: Error finding container 21006c4a06ff5c9d5c0b77e2b95b5568fb9007280e4771fec2e4823b23a02b03: Status 404 returned error can't find the container with id 21006c4a06ff5c9d5c0b77e2b95b5568fb9007280e4771fec2e4823b23a02b03 Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.257060 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq"] Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.260074 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b2tmx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-wwqps_openstack-operators(7c8092df-4a88-4a8c-a400-6435f525a5ec): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.262109 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b2tmx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-wwqps_openstack-operators(7c8092df-4a88-4a8c-a400-6435f525a5ec): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.263352 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" 
podUID="7c8092df-4a88-4a8c-a400-6435f525a5ec" Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.274361 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb"] Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.277714 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kvq4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-q9nzq_openstack-operators(fcf368e1-183d-445d-b3b7-dfd4f08fddcd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 12:32:09 crc kubenswrapper[4760]: W1204 12:32:09.279816 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda01a242a_291b_4281_a331_91c05efcdf87.slice/crio-bb719128e6d7344c1ffed30ed34f71cdc08b07520d1c145bfd9c10dd2e1af85c WatchSource:0}: Error finding container bb719128e6d7344c1ffed30ed34f71cdc08b07520d1c145bfd9c10dd2e1af85c: Status 404 returned error can't find the container with id bb719128e6d7344c1ffed30ed34f71cdc08b07520d1c145bfd9c10dd2e1af85c Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.280080 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kvq4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-q9nzq_openstack-operators(fcf368e1-183d-445d-b3b7-dfd4f08fddcd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.281395 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" podUID="fcf368e1-183d-445d-b3b7-dfd4f08fddcd" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.288804 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g2kld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-57pzb_openstack-operators(a01a242a-291b-4281-a331-91c05efcdf87): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.296271 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g2kld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-57pzb_openstack-operators(a01a242a-291b-4281-a331-91c05efcdf87): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.297583 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" podUID="a01a242a-291b-4281-a331-91c05efcdf87" Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.374405 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-g22dq"] Dec 04 12:32:09 crc kubenswrapper[4760]: W1204 12:32:09.390768 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod750e95e1_6019_4779_a2c0_4abcce4b1c8c.slice/crio-675baf475d524a5a2b3b1abe06bd5fcc07ef6d0deb6bcdad2ae89b8547c4da02 WatchSource:0}: 
Error finding container 675baf475d524a5a2b3b1abe06bd5fcc07ef6d0deb6bcdad2ae89b8547c4da02: Status 404 returned error can't find the container with id 675baf475d524a5a2b3b1abe06bd5fcc07ef6d0deb6bcdad2ae89b8547c4da02 Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.396615 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55"] Dec 04 12:32:09 crc kubenswrapper[4760]: I1204 12:32:09.405489 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9"] Dec 04 12:32:09 crc kubenswrapper[4760]: W1204 12:32:09.409594 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d340c4_abab_4dc8_8ba4_e8740d6b89d4.slice/crio-5039a24c1bb8ef60641308229373a9ea64bce7746b5a59d77468df52f4c234c2 WatchSource:0}: Error finding container 5039a24c1bb8ef60641308229373a9ea64bce7746b5a59d77468df52f4c234c2: Status 404 returned error can't find the container with id 5039a24c1bb8ef60641308229373a9ea64bce7746b5a59d77468df52f4c234c2 Dec 04 12:32:09 crc kubenswrapper[4760]: W1204 12:32:09.411529 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda96cd5b4_6668_4815_b121_777fe0e65833.slice/crio-a323701286e3184e542d78f6df1073c1bde990d3facaea8ccfd9e6e4c5486d7a WatchSource:0}: Error finding container a323701286e3184e542d78f6df1073c1bde990d3facaea8ccfd9e6e4c5486d7a: Status 404 returned error can't find the container with id a323701286e3184e542d78f6df1073c1bde990d3facaea8ccfd9e6e4c5486d7a Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.461953 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xl22k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pptq9_openstack-operators(35d340c4-abab-4dc8-8ba4-e8740d6b89d4): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.463263 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" podUID="35d340c4-abab-4dc8-8ba4-e8740d6b89d4" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.480673 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zrsv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-twx55_openstack-operators(a96cd5b4-6668-4815-b121-777fe0e65833): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.483366 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zrsv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-twx55_openstack-operators(a96cd5b4-6668-4815-b121-777fe0e65833): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 12:32:09 crc kubenswrapper[4760]: E1204 12:32:09.485028 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" podUID="a96cd5b4-6668-4815-b121-777fe0e65833" Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.070552 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert\") pod \"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:10 crc kubenswrapper[4760]: E1204 12:32:10.070842 4760 secret.go:188] 
Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 12:32:10 crc kubenswrapper[4760]: E1204 12:32:10.070968 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert podName:73caa66c-d120-4b70-b417-d7f363ce6236 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:14.070938875 +0000 UTC m=+1137.112385442 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert") pod "infra-operator-controller-manager-57548d458d-2ffsj" (UID: "73caa66c-d120-4b70-b417-d7f363ce6236") : secret "infra-operator-webhook-server-cert" not found Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.076167 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" event={"ID":"a96cd5b4-6668-4815-b121-777fe0e65833","Type":"ContainerStarted","Data":"a323701286e3184e542d78f6df1073c1bde990d3facaea8ccfd9e6e4c5486d7a"} Dec 04 12:32:10 crc kubenswrapper[4760]: E1204 12:32:10.084340 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" podUID="a96cd5b4-6668-4815-b121-777fe0e65833" Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.109831 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" 
event={"ID":"750e95e1-6019-4779-a2c0-4abcce4b1c8c","Type":"ContainerStarted","Data":"675baf475d524a5a2b3b1abe06bd5fcc07ef6d0deb6bcdad2ae89b8547c4da02"} Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.125236 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" event={"ID":"a01a242a-291b-4281-a331-91c05efcdf87","Type":"ContainerStarted","Data":"bb719128e6d7344c1ffed30ed34f71cdc08b07520d1c145bfd9c10dd2e1af85c"} Dec 04 12:32:10 crc kubenswrapper[4760]: E1204 12:32:10.146909 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" podUID="a01a242a-291b-4281-a331-91c05efcdf87" Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.153027 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" event={"ID":"97874450-ee94-4963-aa10-a58295edae62","Type":"ContainerStarted","Data":"c16df851ad91cab659ecdb4eb3e5b6c86d8043feabde3a180cc0537c0c2d8e99"} Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.176448 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" event={"ID":"7c8092df-4a88-4a8c-a400-6435f525a5ec","Type":"ContainerStarted","Data":"21006c4a06ff5c9d5c0b77e2b95b5568fb9007280e4771fec2e4823b23a02b03"} Dec 04 12:32:10 crc kubenswrapper[4760]: E1204 12:32:10.179562 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" podUID="7c8092df-4a88-4a8c-a400-6435f525a5ec" Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.179707 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" event={"ID":"35d340c4-abab-4dc8-8ba4-e8740d6b89d4","Type":"ContainerStarted","Data":"5039a24c1bb8ef60641308229373a9ea64bce7746b5a59d77468df52f4c234c2"} Dec 04 12:32:10 crc kubenswrapper[4760]: E1204 12:32:10.180810 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" podUID="35d340c4-abab-4dc8-8ba4-e8740d6b89d4" Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.184950 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" event={"ID":"546bf075-78c1-4324-ba9a-80ac8df0c4f7","Type":"ContainerStarted","Data":"b83b30b018d7e9e660b699d1f44ec1caa7d2cd692b5fd748eb3d0007a1b58f3a"} Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.195274 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" event={"ID":"fcf368e1-183d-445d-b3b7-dfd4f08fddcd","Type":"ContainerStarted","Data":"85732902300d4873cf6c1d993a186ee9664dfc93ea1ea53db1aed3bb51a10e05"} 
Dec 04 12:32:10 crc kubenswrapper[4760]: E1204 12:32:10.205834 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" podUID="fcf368e1-183d-445d-b3b7-dfd4f08fddcd" Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.217374 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" event={"ID":"21a55b1d-ebff-4abd-a556-d272a1753a5b","Type":"ContainerStarted","Data":"1199ba8e4324aba29402dadd240493b1414d2b32d132a6949d92f0ac5072cfb6"} Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.238020 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" event={"ID":"f13ef420-321e-40f8-90d2-e6fdcbb72752","Type":"ContainerStarted","Data":"b9ec414ebbb99e7a224e274347388eff0aed60f647f0a76f79b1de8af809fe92"} Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.251454 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" event={"ID":"282a54d8-5318-49e0-aefe-a86a7a8d63ac","Type":"ContainerStarted","Data":"30f6dce17499b095c27a6890cfbe8f3fc679a39d2b67fe1222c3f87d99dfaf60"} Dec 04 12:32:10 crc kubenswrapper[4760]: I1204 12:32:10.274504 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: 
\"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:10 crc kubenswrapper[4760]: E1204 12:32:10.274649 4760 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 12:32:10 crc kubenswrapper[4760]: E1204 12:32:10.274716 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert podName:ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:14.274696434 +0000 UTC m=+1137.316142991 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" (UID: "ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 12:32:11 crc kubenswrapper[4760]: I1204 12:32:11.131843 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:11 crc kubenswrapper[4760]: I1204 12:32:11.132033 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:11 crc kubenswrapper[4760]: E1204 
12:32:11.132113 4760 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 12:32:11 crc kubenswrapper[4760]: E1204 12:32:11.132289 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs podName:bfa893f7-8101-4fd1-ae93-94688b827e95 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:15.132260732 +0000 UTC m=+1138.173707309 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs") pod "openstack-operator-controller-manager-77fb648ff9-cnn8v" (UID: "bfa893f7-8101-4fd1-ae93-94688b827e95") : secret "webhook-server-cert" not found Dec 04 12:32:11 crc kubenswrapper[4760]: E1204 12:32:11.132446 4760 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 12:32:11 crc kubenswrapper[4760]: E1204 12:32:11.132543 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs podName:bfa893f7-8101-4fd1-ae93-94688b827e95 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:15.132516801 +0000 UTC m=+1138.173963548 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs") pod "openstack-operator-controller-manager-77fb648ff9-cnn8v" (UID: "bfa893f7-8101-4fd1-ae93-94688b827e95") : secret "metrics-server-cert" not found Dec 04 12:32:11 crc kubenswrapper[4760]: E1204 12:32:11.391592 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" podUID="35d340c4-abab-4dc8-8ba4-e8740d6b89d4" Dec 04 12:32:11 crc kubenswrapper[4760]: E1204 12:32:11.407280 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" podUID="a96cd5b4-6668-4815-b121-777fe0e65833" Dec 04 12:32:11 crc kubenswrapper[4760]: E1204 12:32:11.407674 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" podUID="7c8092df-4a88-4a8c-a400-6435f525a5ec" Dec 04 12:32:11 crc kubenswrapper[4760]: E1204 12:32:11.407733 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" podUID="a01a242a-291b-4281-a331-91c05efcdf87" Dec 04 12:32:11 crc kubenswrapper[4760]: E1204 12:32:11.409251 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" podUID="fcf368e1-183d-445d-b3b7-dfd4f08fddcd" Dec 04 12:32:14 crc kubenswrapper[4760]: I1204 12:32:14.146831 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert\") pod \"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:14 crc kubenswrapper[4760]: E1204 12:32:14.147594 4760 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Dec 04 12:32:14 crc kubenswrapper[4760]: E1204 12:32:14.147831 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert podName:73caa66c-d120-4b70-b417-d7f363ce6236 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:22.147797467 +0000 UTC m=+1145.189244044 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert") pod "infra-operator-controller-manager-57548d458d-2ffsj" (UID: "73caa66c-d120-4b70-b417-d7f363ce6236") : secret "infra-operator-webhook-server-cert" not found Dec 04 12:32:14 crc kubenswrapper[4760]: I1204 12:32:14.349773 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: \"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:14 crc kubenswrapper[4760]: E1204 12:32:14.350098 4760 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 12:32:14 crc kubenswrapper[4760]: E1204 12:32:14.350279 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert podName:ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:22.350242455 +0000 UTC m=+1145.391689022 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" (UID: "ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 12:32:15 crc kubenswrapper[4760]: I1204 12:32:15.163546 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:15 crc kubenswrapper[4760]: I1204 12:32:15.163676 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:15 crc kubenswrapper[4760]: E1204 12:32:15.163823 4760 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 12:32:15 crc kubenswrapper[4760]: E1204 12:32:15.163886 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs podName:bfa893f7-8101-4fd1-ae93-94688b827e95 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:23.163866968 +0000 UTC m=+1146.205313535 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs") pod "openstack-operator-controller-manager-77fb648ff9-cnn8v" (UID: "bfa893f7-8101-4fd1-ae93-94688b827e95") : secret "metrics-server-cert" not found Dec 04 12:32:15 crc kubenswrapper[4760]: E1204 12:32:15.164461 4760 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 12:32:15 crc kubenswrapper[4760]: E1204 12:32:15.164532 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs podName:bfa893f7-8101-4fd1-ae93-94688b827e95 nodeName:}" failed. No retries permitted until 2025-12-04 12:32:23.164485728 +0000 UTC m=+1146.205932295 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs") pod "openstack-operator-controller-manager-77fb648ff9-cnn8v" (UID: "bfa893f7-8101-4fd1-ae93-94688b827e95") : secret "webhook-server-cert" not found Dec 04 12:32:22 crc kubenswrapper[4760]: I1204 12:32:22.178288 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert\") pod \"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:22 crc kubenswrapper[4760]: I1204 12:32:22.183700 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73caa66c-d120-4b70-b417-d7f363ce6236-cert\") pod \"infra-operator-controller-manager-57548d458d-2ffsj\" (UID: \"73caa66c-d120-4b70-b417-d7f363ce6236\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:22 crc 
kubenswrapper[4760]: I1204 12:32:22.270196 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2dbkw" Dec 04 12:32:22 crc kubenswrapper[4760]: I1204 12:32:22.277718 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:22 crc kubenswrapper[4760]: I1204 12:32:22.382888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: \"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:22 crc kubenswrapper[4760]: I1204 12:32:22.386530 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7\" (UID: \"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:22 crc kubenswrapper[4760]: I1204 12:32:22.626919 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-49lhb" Dec 04 12:32:22 crc kubenswrapper[4760]: I1204 12:32:22.634007 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:22 crc kubenswrapper[4760]: E1204 12:32:22.963993 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 04 12:32:22 crc kubenswrapper[4760]: E1204 12:32:22.964241 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6t7jg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-slr7f_openstack-operators(f13ef420-321e-40f8-90d2-e6fdcbb72752): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:32:23 crc kubenswrapper[4760]: I1204 12:32:23.234986 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:23 crc kubenswrapper[4760]: I1204 12:32:23.235114 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:23 crc kubenswrapper[4760]: I1204 12:32:23.241098 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-metrics-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:23 crc kubenswrapper[4760]: I1204 12:32:23.258242 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bfa893f7-8101-4fd1-ae93-94688b827e95-webhook-certs\") pod \"openstack-operator-controller-manager-77fb648ff9-cnn8v\" (UID: \"bfa893f7-8101-4fd1-ae93-94688b827e95\") " pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:23 crc kubenswrapper[4760]: I1204 12:32:23.361561 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-v8dwg" Dec 04 12:32:23 crc kubenswrapper[4760]: I1204 12:32:23.369389 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:23 crc kubenswrapper[4760]: E1204 12:32:23.601667 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 04 12:32:23 crc kubenswrapper[4760]: E1204 12:32:23.601904 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4r8qv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-g64p8_openstack-operators(b7918a8b-7f47-4d71-820b-95156b273357): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:32:24 crc kubenswrapper[4760]: E1204 12:32:24.428226 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 04 12:32:24 crc kubenswrapper[4760]: E1204 12:32:24.428773 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wdzgx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-g22dq_openstack-operators(750e95e1-6019-4779-a2c0-4abcce4b1c8c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:32:25 crc kubenswrapper[4760]: E1204 12:32:25.234724 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 04 12:32:25 crc kubenswrapper[4760]: E1204 12:32:25.235370 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r6d9x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-zxmkv_openstack-operators(97874450-ee94-4963-aa10-a58295edae62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:32:26 crc kubenswrapper[4760]: E1204 12:32:26.047175 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 04 12:32:26 crc kubenswrapper[4760]: E1204 12:32:26.047375 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7lhgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-qgdz6_openstack-operators(2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:32:33 crc kubenswrapper[4760]: I1204 12:32:33.380975 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:32:33 crc kubenswrapper[4760]: I1204 12:32:33.381539 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:32:35 crc kubenswrapper[4760]: E1204 12:32:35.796105 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 04 12:32:35 crc kubenswrapper[4760]: E1204 12:32:35.796908 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zjdkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-kzkts_openstack-operators(69f6297b-8cdc-4bfc-ba61-1868e7805998): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:32:36 crc kubenswrapper[4760]: E1204 12:32:36.287468 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621" Dec 04 12:32:36 crc kubenswrapper[4760]: E1204 12:32:36.287854 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b2tmx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-wwqps_openstack-operators(7c8092df-4a88-4a8c-a400-6435f525a5ec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:32:37 crc kubenswrapper[4760]: E1204 12:32:37.456759 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 04 12:32:37 crc kubenswrapper[4760]: E1204 12:32:37.457544 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tqrwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-42kkv_openstack-operators(9093443e-058e-41f0-81ea-9ff8ba566d8a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:32:38 crc kubenswrapper[4760]: E1204 12:32:38.415270 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 04 12:32:38 crc kubenswrapper[4760]: E1204 12:32:38.415615 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zc8w9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-m9phl_openstack-operators(546bf075-78c1-4324-ba9a-80ac8df0c4f7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:32:39 crc kubenswrapper[4760]: E1204 12:32:39.077615 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 04 12:32:39 crc kubenswrapper[4760]: E1204 12:32:39.078006 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m7pjq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-4866m_openstack-operators(35ad36be-f7a9-4ca8-bd29-0d5ccd658c53): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:32:40 crc kubenswrapper[4760]: E1204 12:32:40.222870 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:707056f079b795fdeecba627f6fc856d1c154aaa3e9c4978a7d27a54da29bb37: Get \"https://quay.io/v2/openstack-k8s-operators/rabbitmq-cluster-operator/blobs/sha256:707056f079b795fdeecba627f6fc856d1c154aaa3e9c4978a7d27a54da29bb37\": context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 04 12:32:40 crc kubenswrapper[4760]: E1204 12:32:40.223198 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xl22k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pptq9_openstack-operators(35d340c4-abab-4dc8-8ba4-e8740d6b89d4): 
ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:707056f079b795fdeecba627f6fc856d1c154aaa3e9c4978a7d27a54da29bb37: Get \"https://quay.io/v2/openstack-k8s-operators/rabbitmq-cluster-operator/blobs/sha256:707056f079b795fdeecba627f6fc856d1c154aaa3e9c4978a7d27a54da29bb37\": context canceled" logger="UnhandledError" Dec 04 12:32:40 crc kubenswrapper[4760]: E1204 12:32:40.224971 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:707056f079b795fdeecba627f6fc856d1c154aaa3e9c4978a7d27a54da29bb37: Get \\\"https://quay.io/v2/openstack-k8s-operators/rabbitmq-cluster-operator/blobs/sha256:707056f079b795fdeecba627f6fc856d1c154aaa3e9c4978a7d27a54da29bb37\\\": context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" podUID="35d340c4-abab-4dc8-8ba4-e8740d6b89d4" Dec 04 12:32:43 crc kubenswrapper[4760]: I1204 12:32:43.549785 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7"] Dec 04 12:32:43 crc kubenswrapper[4760]: I1204 12:32:43.563341 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v"] Dec 04 12:32:43 crc kubenswrapper[4760]: I1204 12:32:43.626661 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj"] Dec 04 12:32:43 crc kubenswrapper[4760]: W1204 12:32:43.728404 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef4473bb_6ed2_4f7c_96f0_e2998ff6fbc9.slice/crio-a5fb27380c3670855f787b09f7d3fcb5f38f221dfa96a5c8d12cc27572d7fb20 WatchSource:0}: Error finding container a5fb27380c3670855f787b09f7d3fcb5f38f221dfa96a5c8d12cc27572d7fb20: Status 404 returned error can't 
find the container with id a5fb27380c3670855f787b09f7d3fcb5f38f221dfa96a5c8d12cc27572d7fb20 Dec 04 12:32:44 crc kubenswrapper[4760]: I1204 12:32:44.024672 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" event={"ID":"bfa893f7-8101-4fd1-ae93-94688b827e95","Type":"ContainerStarted","Data":"9e95a71b1b27198d8935277fbddd9a6cdb88b0216ef8d39bb3e47b4be8548721"} Dec 04 12:32:44 crc kubenswrapper[4760]: I1204 12:32:44.036522 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" event={"ID":"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9","Type":"ContainerStarted","Data":"a5fb27380c3670855f787b09f7d3fcb5f38f221dfa96a5c8d12cc27572d7fb20"} Dec 04 12:32:44 crc kubenswrapper[4760]: I1204 12:32:44.039178 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" event={"ID":"73caa66c-d120-4b70-b417-d7f363ce6236","Type":"ContainerStarted","Data":"aa47002d4905a36877f21e45f6e44da900527a634514686fd34e866125f66d5b"} Dec 04 12:32:45 crc kubenswrapper[4760]: I1204 12:32:45.053059 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" event={"ID":"abc24b8c-0be3-44c7-b011-5ea10803fdf1","Type":"ContainerStarted","Data":"406229d269d4e06e41cc76a742f23653e06f6cd4b5e9f30140dfe232d671a962"} Dec 04 12:32:45 crc kubenswrapper[4760]: I1204 12:32:45.056146 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" event={"ID":"bfa893f7-8101-4fd1-ae93-94688b827e95","Type":"ContainerStarted","Data":"181180737a3608fed8d1a26e0549f7896c2af1ea875978fab24dd4b73ecdc07f"} Dec 04 12:32:45 crc kubenswrapper[4760]: I1204 12:32:45.056251 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:45 crc kubenswrapper[4760]: I1204 12:32:45.058090 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" event={"ID":"e8a9a9f4-8e40-4506-9aeb-c3e83d62de39","Type":"ContainerStarted","Data":"28c60f160593885d3f6e3c6580861e38bfbb1004de67a94cfff22d6f3518c31b"} Dec 04 12:32:45 crc kubenswrapper[4760]: I1204 12:32:45.060239 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" event={"ID":"21a55b1d-ebff-4abd-a556-d272a1753a5b","Type":"ContainerStarted","Data":"37872d482b3427d565fa72f87d6474d1671f7dfa6e1baf7146fc322b25a88d35"} Dec 04 12:32:45 crc kubenswrapper[4760]: I1204 12:32:45.061554 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" event={"ID":"282a54d8-5318-49e0-aefe-a86a7a8d63ac","Type":"ContainerStarted","Data":"7e979ecc98d1747892624b58a789af51aa298fa093825a0ba98caae516a87d33"} Dec 04 12:32:45 crc kubenswrapper[4760]: I1204 12:32:45.062617 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" event={"ID":"4018757f-a398-4734-9a4e-b6cc11327b9f","Type":"ContainerStarted","Data":"668fb4a83d3871a95076f813d70affb84d66238a8df2e4cb7aa56ec9a781cefd"} Dec 04 12:32:45 crc kubenswrapper[4760]: I1204 12:32:45.064901 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" event={"ID":"ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a","Type":"ContainerStarted","Data":"29c14b90fd63a54aa7b3e218da85d7013a6f658c8f9d381c7b19e259b340d882"} Dec 04 12:32:45 crc kubenswrapper[4760]: I1204 12:32:45.171397 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" podStartSLOduration=39.171371741 podStartE2EDuration="39.171371741s" podCreationTimestamp="2025-12-04 12:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:32:45.170778312 +0000 UTC m=+1168.212224889" watchObservedRunningTime="2025-12-04 12:32:45.171371741 +0000 UTC m=+1168.212818308" Dec 04 12:32:46 crc kubenswrapper[4760]: I1204 12:32:46.100611 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" event={"ID":"fcf368e1-183d-445d-b3b7-dfd4f08fddcd","Type":"ContainerStarted","Data":"f5a58a71d473c4b31de1c49111b1d43ac58539164403af3016a58c0c92adab76"} Dec 04 12:32:50 crc kubenswrapper[4760]: I1204 12:32:50.155933 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" event={"ID":"a96cd5b4-6668-4815-b121-777fe0e65833","Type":"ContainerStarted","Data":"0d9a7e9d55d0954efd4784cea2f9317d9d6e53e1f807f854bf7f5f8ef1ab7827"} Dec 04 12:32:51 crc kubenswrapper[4760]: I1204 12:32:51.165452 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" event={"ID":"a01a242a-291b-4281-a331-91c05efcdf87","Type":"ContainerStarted","Data":"fd82f6eea969cfcdc8ee0c1417ef4ecfb3ed27c66b6801b1ee8af94523bf31ba"} Dec 04 12:32:51 crc kubenswrapper[4760]: E1204 12:32:51.915045 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" podUID="35d340c4-abab-4dc8-8ba4-e8740d6b89d4" 
Dec 04 12:32:52 crc kubenswrapper[4760]: E1204 12:32:52.801782 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" podUID="b7918a8b-7f47-4d71-820b-95156b273357" Dec 04 12:32:52 crc kubenswrapper[4760]: E1204 12:32:52.835309 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" podUID="2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd" Dec 04 12:32:53 crc kubenswrapper[4760]: I1204 12:32:53.258807 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" event={"ID":"2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd","Type":"ContainerStarted","Data":"a735410a4334e9aad0c4f6b1ca65b1ec3b177a4748f654f154a751574d1d5dd0"} Dec 04 12:32:53 crc kubenswrapper[4760]: I1204 12:32:53.282775 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" event={"ID":"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9","Type":"ContainerStarted","Data":"2e37f32e03fabe441f533f92f35096fa71e90134e730f75e90bf1c9a94d64f60"} Dec 04 12:32:53 crc kubenswrapper[4760]: I1204 12:32:53.296134 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" event={"ID":"b7918a8b-7f47-4d71-820b-95156b273357","Type":"ContainerStarted","Data":"55f2bb2eadac13a90470d5b1ef707e0a05b2c1498a0499da174c4b99d7d8fcd1"} Dec 04 12:32:53 crc kubenswrapper[4760]: I1204 12:32:53.324719 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" event={"ID":"73caa66c-d120-4b70-b417-d7f363ce6236","Type":"ContainerStarted","Data":"5db9de56f7410c2c0ddf7bc3c910f610eec86c10be23a8315968ac4b58786728"} Dec 04 12:32:53 crc kubenswrapper[4760]: E1204 12:32:53.333531 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" podUID="97874450-ee94-4963-aa10-a58295edae62" Dec 04 12:32:53 crc kubenswrapper[4760]: E1204 12:32:53.439837 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" podUID="546bf075-78c1-4324-ba9a-80ac8df0c4f7" Dec 04 12:32:53 crc kubenswrapper[4760]: I1204 12:32:53.442779 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-77fb648ff9-cnn8v" Dec 04 12:32:53 crc kubenswrapper[4760]: E1204 12:32:53.450096 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" podUID="69f6297b-8cdc-4bfc-ba61-1868e7805998" Dec 04 12:32:53 crc kubenswrapper[4760]: E1204 12:32:53.464785 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" podUID="7c8092df-4a88-4a8c-a400-6435f525a5ec" Dec 04 12:32:53 
crc kubenswrapper[4760]: E1204 12:32:53.942469 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" podUID="9093443e-058e-41f0-81ea-9ff8ba566d8a" Dec 04 12:32:53 crc kubenswrapper[4760]: E1204 12:32:53.996300 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" podUID="750e95e1-6019-4779-a2c0-4abcce4b1c8c" Dec 04 12:32:54 crc kubenswrapper[4760]: E1204 12:32:54.065725 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" podUID="f13ef420-321e-40f8-90d2-e6fdcbb72752" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.334791 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" event={"ID":"e8a9a9f4-8e40-4506-9aeb-c3e83d62de39","Type":"ContainerStarted","Data":"36606e842e3e66da41f0b0afd598d5297eb9ae1418e8ad9f791d78d8495a881d"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.334868 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.336249 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" 
event={"ID":"9093443e-058e-41f0-81ea-9ff8ba566d8a","Type":"ContainerStarted","Data":"5c193910c192d20f63cfef7be8d7a3b6c9386b5cad8673fd79f20cc1e9afc4a9"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.337707 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" event={"ID":"ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a","Type":"ContainerStarted","Data":"01dad5cc49eff21ad9915db2397e8ddb788820934f6fd415d2dde82220808e8c"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.337868 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.338124 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.339237 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" event={"ID":"fcf368e1-183d-445d-b3b7-dfd4f08fddcd","Type":"ContainerStarted","Data":"e39d2f9e6615ca31049a0d29b2941777e818d1b6f13fda39f16a4f6a34e51b4b"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.339417 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.340897 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.342326 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" 
event={"ID":"282a54d8-5318-49e0-aefe-a86a7a8d63ac","Type":"ContainerStarted","Data":"80b6531128a41bb6bfd1f832db7e9d4960e0a970288f43cccc0b375c9081b966"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.342476 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.342694 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.343967 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" event={"ID":"546bf075-78c1-4324-ba9a-80ac8df0c4f7","Type":"ContainerStarted","Data":"635bceac978ce412d2453f0a96a7d7c804c9cec85f7acf6f94f6735f809cfe85"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.344613 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.345469 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" event={"ID":"750e95e1-6019-4779-a2c0-4abcce4b1c8c","Type":"ContainerStarted","Data":"4c0d52b25e33a3acb43adb958d15791e73c9485e049ef4fd552b1559b9b493c6"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.347469 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" event={"ID":"35ad36be-f7a9-4ca8-bd29-0d5ccd658c53","Type":"ContainerStarted","Data":"2744d76bf0e536d23474ef7d6f12781f76e01a1e47516e1271749c886004c8cf"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.349155 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" event={"ID":"4018757f-a398-4734-9a4e-b6cc11327b9f","Type":"ContainerStarted","Data":"ffa636d3b869d9455d4c12a052d97a82d495ca107f4938687da2e6acb9f3b7b4"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.349411 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.350461 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" event={"ID":"97874450-ee94-4963-aa10-a58295edae62","Type":"ContainerStarted","Data":"8b90e8e118176884e16cc986fcf026b09c6207c5feedb8daafb829e60908d458"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.351479 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.352760 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" event={"ID":"abc24b8c-0be3-44c7-b011-5ea10803fdf1","Type":"ContainerStarted","Data":"35499b2ad71cceb57ba3e9293c58110e1a26156df1a9838326c21c3ba3ebfa80"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.352954 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.354596 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" event={"ID":"f13ef420-321e-40f8-90d2-e6fdcbb72752","Type":"ContainerStarted","Data":"f68c09e941d35e14accaf1c18b4c936a1dd10442eea7c5cf99ef3133228e9207"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.355316 4760 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.357258 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" event={"ID":"69f6297b-8cdc-4bfc-ba61-1868e7805998","Type":"ContainerStarted","Data":"0700ae48a5ccaaeae9dc957b6e802901e0ae4d462edb6d29c0f7dc1d14b1d69e"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.359041 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" event={"ID":"7c8092df-4a88-4a8c-a400-6435f525a5ec","Type":"ContainerStarted","Data":"3f3318137d75706706281139e9e2a92924db83b3e68f7aaa47402fbb90d4d12c"} Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.370031 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" podStartSLOduration=5.584421849 podStartE2EDuration="49.370002544s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:08.397740011 +0000 UTC m=+1131.439186578" lastFinishedPulling="2025-12-04 12:32:52.183320706 +0000 UTC m=+1175.224767273" observedRunningTime="2025-12-04 12:32:54.361785793 +0000 UTC m=+1177.403232360" watchObservedRunningTime="2025-12-04 12:32:54.370002544 +0000 UTC m=+1177.411449121" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.479441 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-thpz6" podStartSLOduration=5.943611562 podStartE2EDuration="49.479415047s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:08.854502543 +0000 UTC m=+1131.895949110" lastFinishedPulling="2025-12-04 12:32:52.390306028 +0000 UTC m=+1175.431752595" 
observedRunningTime="2025-12-04 12:32:54.438832269 +0000 UTC m=+1177.480278836" watchObservedRunningTime="2025-12-04 12:32:54.479415047 +0000 UTC m=+1177.520861614" Dec 04 12:32:54 crc kubenswrapper[4760]: E1204 12:32:54.510101 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" podUID="35ad36be-f7a9-4ca8-bd29-0d5ccd658c53" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.634797 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfqvw" podStartSLOduration=5.893327007 podStartE2EDuration="49.634771521s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.000688804 +0000 UTC m=+1132.042135371" lastFinishedPulling="2025-12-04 12:32:52.742133318 +0000 UTC m=+1175.783579885" observedRunningTime="2025-12-04 12:32:54.544880476 +0000 UTC m=+1177.586327043" watchObservedRunningTime="2025-12-04 12:32:54.634771521 +0000 UTC m=+1177.676218088" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.638575 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-q9nzq" podStartSLOduration=5.433185987 podStartE2EDuration="48.638543571s" podCreationTimestamp="2025-12-04 12:32:06 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.277514793 +0000 UTC m=+1132.318961360" lastFinishedPulling="2025-12-04 12:32:52.482872377 +0000 UTC m=+1175.524318944" observedRunningTime="2025-12-04 12:32:54.593900303 +0000 UTC m=+1177.635346870" watchObservedRunningTime="2025-12-04 12:32:54.638543571 +0000 UTC m=+1177.679990148" Dec 04 12:32:54 crc kubenswrapper[4760]: E1204 12:32:54.892343 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" podUID="7c8092df-4a88-4a8c-a400-6435f525a5ec" Dec 04 12:32:54 crc kubenswrapper[4760]: I1204 12:32:54.903161 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x5zgc" podStartSLOduration=6.831334498 podStartE2EDuration="49.903138691s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.258950924 +0000 UTC m=+1132.300397491" lastFinishedPulling="2025-12-04 12:32:52.330755117 +0000 UTC m=+1175.372201684" observedRunningTime="2025-12-04 12:32:54.8920573 +0000 UTC m=+1177.933503887" watchObservedRunningTime="2025-12-04 12:32:54.903138691 +0000 UTC m=+1177.944585258" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.039554 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-5jtgd" podStartSLOduration=6.150022856 podStartE2EDuration="50.039536832s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:08.421455463 +0000 UTC m=+1131.462902030" lastFinishedPulling="2025-12-04 12:32:52.310969439 +0000 UTC m=+1175.352416006" observedRunningTime="2025-12-04 12:32:55.032477908 +0000 UTC m=+1178.073924505" watchObservedRunningTime="2025-12-04 12:32:55.039536832 +0000 UTC m=+1178.080983399" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.388983 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" 
event={"ID":"a96cd5b4-6668-4815-b121-777fe0e65833","Type":"ContainerStarted","Data":"3e3417345b283d3d7e35593feebc1efb9a5474d6af722cf9a5f9267a0c6dc8ce"} Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.389360 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.393642 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.398915 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" event={"ID":"21a55b1d-ebff-4abd-a556-d272a1753a5b","Type":"ContainerStarted","Data":"1b4806919cddecada332c420a73f424b6e821cc93e1597fa6aa367ff0fdab910"} Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.400027 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.404930 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.407008 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" event={"ID":"ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9","Type":"ContainerStarted","Data":"8ffd61f66d2c868685ca6618dffa3fe62dff849a2dafe92afaf3a861701c9bc0"} Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.407344 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 
12:32:55.420191 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" event={"ID":"b7918a8b-7f47-4d71-820b-95156b273357","Type":"ContainerStarted","Data":"e85ad9d9418e1ebfbcc702e1c95a1b4d33f7809dac145372768f2649f50f67c7"} Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.421145 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.429467 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" event={"ID":"a01a242a-291b-4281-a331-91c05efcdf87","Type":"ContainerStarted","Data":"d442969687ebba210760f418b66dab83be7672290a653f5966694b2935b9d753"} Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.430481 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.432776 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.443378 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" event={"ID":"73caa66c-d120-4b70-b417-d7f363ce6236","Type":"ContainerStarted","Data":"4b0d468b5bac89ef29f45af76864048da3355441330d54d43f6e52790251ab14"} Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.444400 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.464520 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-ss2gd" podStartSLOduration=6.145321278 podStartE2EDuration="49.464492885s" podCreationTimestamp="2025-12-04 12:32:06 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.235134378 +0000 UTC m=+1132.276580955" lastFinishedPulling="2025-12-04 12:32:52.554305985 +0000 UTC m=+1175.595752562" observedRunningTime="2025-12-04 12:32:55.449366874 +0000 UTC m=+1178.490813441" watchObservedRunningTime="2025-12-04 12:32:55.464492885 +0000 UTC m=+1178.505939452" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.465202 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-twx55" podStartSLOduration=6.025988268 podStartE2EDuration="49.465194377s" podCreationTimestamp="2025-12-04 12:32:06 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.480515149 +0000 UTC m=+1132.521961716" lastFinishedPulling="2025-12-04 12:32:52.919721248 +0000 UTC m=+1175.961167825" observedRunningTime="2025-12-04 12:32:55.418331649 +0000 UTC m=+1178.459778216" watchObservedRunningTime="2025-12-04 12:32:55.465194377 +0000 UTC m=+1178.506640944" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.467623 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" event={"ID":"2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd","Type":"ContainerStarted","Data":"3238ace31f6e3a8b34e620133eb935afa6edf25e48aba97744a4038506cc08d2"} Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.590939 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" podStartSLOduration=41.342753202 podStartE2EDuration="49.590899537s" podCreationTimestamp="2025-12-04 12:32:06 +0000 UTC" firstStartedPulling="2025-12-04 12:32:43.734579112 +0000 UTC m=+1166.776025679" lastFinishedPulling="2025-12-04 
12:32:51.982725457 +0000 UTC m=+1175.024172014" observedRunningTime="2025-12-04 12:32:55.580738865 +0000 UTC m=+1178.622185442" watchObservedRunningTime="2025-12-04 12:32:55.590899537 +0000 UTC m=+1178.632346114" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.692038 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" podStartSLOduration=42.444781093 podStartE2EDuration="50.692005128s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:43.735485291 +0000 UTC m=+1166.776931858" lastFinishedPulling="2025-12-04 12:32:51.982709326 +0000 UTC m=+1175.024155893" observedRunningTime="2025-12-04 12:32:55.677867239 +0000 UTC m=+1178.719313816" watchObservedRunningTime="2025-12-04 12:32:55.692005128 +0000 UTC m=+1178.733451725" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.744790 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" podStartSLOduration=4.611560531 podStartE2EDuration="50.744722163s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:08.893564683 +0000 UTC m=+1131.935011250" lastFinishedPulling="2025-12-04 12:32:55.026726315 +0000 UTC m=+1178.068172882" observedRunningTime="2025-12-04 12:32:55.737511703 +0000 UTC m=+1178.778958280" watchObservedRunningTime="2025-12-04 12:32:55.744722163 +0000 UTC m=+1178.786168730" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.816924 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" podStartSLOduration=4.186733972 podStartE2EDuration="50.816901024s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:08.398122713 +0000 UTC m=+1131.439569280" lastFinishedPulling="2025-12-04 12:32:55.028289765 
+0000 UTC m=+1178.069736332" observedRunningTime="2025-12-04 12:32:55.772543046 +0000 UTC m=+1178.813989623" watchObservedRunningTime="2025-12-04 12:32:55.816901024 +0000 UTC m=+1178.858347591" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.846269 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-57pzb" podStartSLOduration=6.644086452 podStartE2EDuration="49.846241985s" podCreationTimestamp="2025-12-04 12:32:06 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.286623312 +0000 UTC m=+1132.328069879" lastFinishedPulling="2025-12-04 12:32:52.488778845 +0000 UTC m=+1175.530225412" observedRunningTime="2025-12-04 12:32:55.818880007 +0000 UTC m=+1178.860326594" watchObservedRunningTime="2025-12-04 12:32:55.846241985 +0000 UTC m=+1178.887688562" Dec 04 12:32:55 crc kubenswrapper[4760]: I1204 12:32:55.925777 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.476310 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" event={"ID":"f13ef420-321e-40f8-90d2-e6fdcbb72752","Type":"ContainerStarted","Data":"13dc1179806fb1a9f5ebe8025b57f16aa64978c650dd50dee3b6efaf1edf9a50"} Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.476459 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.478468 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" event={"ID":"35ad36be-f7a9-4ca8-bd29-0d5ccd658c53","Type":"ContainerStarted","Data":"beba239bb5bfc48488e5ea6fee07cb2cd15495988e2f96eaa48687a3bc10ca26"} Dec 04 12:32:56 crc 
kubenswrapper[4760]: I1204 12:32:56.479257 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.482651 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" event={"ID":"9093443e-058e-41f0-81ea-9ff8ba566d8a","Type":"ContainerStarted","Data":"1e670162ea22ac6543123d93582293155a7c69bc12ebb4993ce20d4d5c89dd27"} Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.483129 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.485765 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" event={"ID":"69f6297b-8cdc-4bfc-ba61-1868e7805998","Type":"ContainerStarted","Data":"400a6e2a0d4cd9d1326acd2d0a7f835a0b21b6ebcc0532fccbafeb8a2de93d25"} Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.485847 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.488346 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" event={"ID":"97874450-ee94-4963-aa10-a58295edae62","Type":"ContainerStarted","Data":"8570e07fc74a0d40e51ff2d855331f2e0174f7ae5945bfc21034156e837c0d6b"} Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.488507 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.490888 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" event={"ID":"546bf075-78c1-4324-ba9a-80ac8df0c4f7","Type":"ContainerStarted","Data":"c4814f56a7325761b96e39c00d51c4e6f83148e48ecd59753651f845fecb41ca"} Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.491773 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.495061 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" event={"ID":"750e95e1-6019-4779-a2c0-4abcce4b1c8c","Type":"ContainerStarted","Data":"7792014b584a80eed674a65a3354c5f28494ee20d26c3396aff895299cd108e8"} Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.495100 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.502411 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" podStartSLOduration=4.148368954 podStartE2EDuration="50.502391419s" podCreationTimestamp="2025-12-04 12:32:06 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.129093121 +0000 UTC m=+1132.170539688" lastFinishedPulling="2025-12-04 12:32:55.483115586 +0000 UTC m=+1178.524562153" observedRunningTime="2025-12-04 12:32:56.500470098 +0000 UTC m=+1179.541916675" watchObservedRunningTime="2025-12-04 12:32:56.502391419 +0000 UTC m=+1179.543837996" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.553466 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" podStartSLOduration=4.23172156 podStartE2EDuration="50.55344831s" podCreationTimestamp="2025-12-04 12:32:06 +0000 UTC" 
firstStartedPulling="2025-12-04 12:32:09.395724706 +0000 UTC m=+1132.437171283" lastFinishedPulling="2025-12-04 12:32:55.717451466 +0000 UTC m=+1178.758898033" observedRunningTime="2025-12-04 12:32:56.54654719 +0000 UTC m=+1179.587993757" watchObservedRunningTime="2025-12-04 12:32:56.55344831 +0000 UTC m=+1179.594894877" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.663679 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" podStartSLOduration=4.310736638 podStartE2EDuration="51.663656988s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:08.353921569 +0000 UTC m=+1131.395368136" lastFinishedPulling="2025-12-04 12:32:55.706841919 +0000 UTC m=+1178.748288486" observedRunningTime="2025-12-04 12:32:56.663543985 +0000 UTC m=+1179.704990552" watchObservedRunningTime="2025-12-04 12:32:56.663656988 +0000 UTC m=+1179.705103555" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.665977 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" podStartSLOduration=4.596165611 podStartE2EDuration="50.665968592s" podCreationTimestamp="2025-12-04 12:32:06 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.238617538 +0000 UTC m=+1132.280064105" lastFinishedPulling="2025-12-04 12:32:55.308420519 +0000 UTC m=+1178.349867086" observedRunningTime="2025-12-04 12:32:56.636659802 +0000 UTC m=+1179.678106379" watchObservedRunningTime="2025-12-04 12:32:56.665968592 +0000 UTC m=+1179.707415159" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.705976 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" podStartSLOduration=4.760596341 podStartE2EDuration="51.705955441s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" 
firstStartedPulling="2025-12-04 12:32:08.769259136 +0000 UTC m=+1131.810705703" lastFinishedPulling="2025-12-04 12:32:55.714618236 +0000 UTC m=+1178.756064803" observedRunningTime="2025-12-04 12:32:56.702197772 +0000 UTC m=+1179.743644369" watchObservedRunningTime="2025-12-04 12:32:56.705955441 +0000 UTC m=+1179.747402008" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.768739 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" podStartSLOduration=4.1157885 podStartE2EDuration="51.768719045s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:08.378765208 +0000 UTC m=+1131.420211775" lastFinishedPulling="2025-12-04 12:32:56.031695753 +0000 UTC m=+1179.073142320" observedRunningTime="2025-12-04 12:32:56.761697521 +0000 UTC m=+1179.803144108" watchObservedRunningTime="2025-12-04 12:32:56.768719045 +0000 UTC m=+1179.810165612" Dec 04 12:32:56 crc kubenswrapper[4760]: I1204 12:32:56.797700 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" podStartSLOduration=5.391965438 podStartE2EDuration="51.797673603s" podCreationTimestamp="2025-12-04 12:32:05 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.085676223 +0000 UTC m=+1132.127122790" lastFinishedPulling="2025-12-04 12:32:55.491384378 +0000 UTC m=+1178.532830955" observedRunningTime="2025-12-04 12:32:56.791584911 +0000 UTC m=+1179.833031478" watchObservedRunningTime="2025-12-04 12:32:56.797673603 +0000 UTC m=+1179.839120170" Dec 04 12:32:57 crc kubenswrapper[4760]: I1204 12:32:57.521423 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-2ffsj" Dec 04 12:33:02 crc kubenswrapper[4760]: I1204 12:33:02.640619 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7" Dec 04 12:33:03 crc kubenswrapper[4760]: I1204 12:33:03.380964 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:33:03 crc kubenswrapper[4760]: I1204 12:33:03.381046 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:33:05 crc kubenswrapper[4760]: I1204 12:33:05.897078 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" Dec 04 12:33:05 crc kubenswrapper[4760]: I1204 12:33:05.927482 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-kzkts" Dec 04 12:33:05 crc kubenswrapper[4760]: I1204 12:33:05.983988 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-42kkv" Dec 04 12:33:06 crc kubenswrapper[4760]: I1204 12:33:06.405444 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-g64p8" Dec 04 12:33:06 crc kubenswrapper[4760]: I1204 12:33:06.418260 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-4866m" Dec 04 12:33:06 crc kubenswrapper[4760]: I1204 12:33:06.533860 4760 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-m9phl" Dec 04 12:33:07 crc kubenswrapper[4760]: I1204 12:33:07.442330 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-slr7f" Dec 04 12:33:07 crc kubenswrapper[4760]: I1204 12:33:07.450405 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-zxmkv" Dec 04 12:33:07 crc kubenswrapper[4760]: I1204 12:33:07.453377 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-g22dq" Dec 04 12:33:08 crc kubenswrapper[4760]: I1204 12:33:08.914665 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" event={"ID":"7c8092df-4a88-4a8c-a400-6435f525a5ec","Type":"ContainerStarted","Data":"0cd6db410f2ef813d11b87d583c98e2d4670a147cc8099e1d1eb2975100a14e9"} Dec 04 12:33:08 crc kubenswrapper[4760]: I1204 12:33:08.915174 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" Dec 04 12:33:08 crc kubenswrapper[4760]: I1204 12:33:08.917622 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" event={"ID":"35d340c4-abab-4dc8-8ba4-e8740d6b89d4","Type":"ContainerStarted","Data":"4f66c2603266c014f97f1303ab8043c9680bf76e089b61b9d22b131951605b69"} Dec 04 12:33:08 crc kubenswrapper[4760]: I1204 12:33:08.939034 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" podStartSLOduration=3.843807364 podStartE2EDuration="1m2.939011817s" 
podCreationTimestamp="2025-12-04 12:32:06 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.259864303 +0000 UTC m=+1132.301310870" lastFinishedPulling="2025-12-04 12:33:08.355068756 +0000 UTC m=+1191.396515323" observedRunningTime="2025-12-04 12:33:08.931726885 +0000 UTC m=+1191.973173452" watchObservedRunningTime="2025-12-04 12:33:08.939011817 +0000 UTC m=+1191.980458384" Dec 04 12:33:08 crc kubenswrapper[4760]: I1204 12:33:08.947364 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pptq9" podStartSLOduration=3.250754925 podStartE2EDuration="1m1.947343761s" podCreationTimestamp="2025-12-04 12:32:07 +0000 UTC" firstStartedPulling="2025-12-04 12:32:09.461768604 +0000 UTC m=+1132.503215181" lastFinishedPulling="2025-12-04 12:33:08.15835745 +0000 UTC m=+1191.199804017" observedRunningTime="2025-12-04 12:33:08.945492762 +0000 UTC m=+1191.986939329" watchObservedRunningTime="2025-12-04 12:33:08.947343761 +0000 UTC m=+1191.988790328" Dec 04 12:33:17 crc kubenswrapper[4760]: I1204 12:33:17.449452 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wwqps" Dec 04 12:33:33 crc kubenswrapper[4760]: I1204 12:33:33.380867 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:33:33 crc kubenswrapper[4760]: I1204 12:33:33.381728 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Dec 04 12:33:33 crc kubenswrapper[4760]: I1204 12:33:33.381793 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:33:33 crc kubenswrapper[4760]: I1204 12:33:33.382762 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bbb8ff1383b54b37d35a08dd354725d1bf3d8a55864345be2ff083742830474e"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 12:33:33 crc kubenswrapper[4760]: I1204 12:33:33.382833 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://bbb8ff1383b54b37d35a08dd354725d1bf3d8a55864345be2ff083742830474e" gracePeriod=600 Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.172860 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="bbb8ff1383b54b37d35a08dd354725d1bf3d8a55864345be2ff083742830474e" exitCode=0 Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.173609 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"bbb8ff1383b54b37d35a08dd354725d1bf3d8a55864345be2ff083742830474e"} Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.173687 4760 scope.go:117] "RemoveContainer" containerID="9dbcb718be2a7f2596059e1c2783a32fa9aefcba6858c3d8e8320ae2bdc7181a" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.338382 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5t7bh"] Dec 04 12:33:34 crc 
kubenswrapper[4760]: I1204 12:33:34.339995 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.345052 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-hgplm" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.345363 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.351105 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5t7bh"] Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.473426 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sf9zs"] Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.478248 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.482875 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.497647 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sf9zs"] Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.529711 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdj7d\" (UniqueName: \"kubernetes.io/projected/99f9835b-e77b-4032-9238-b41836e0c480-kube-api-access-gdj7d\") pod \"dnsmasq-dns-675f4bcbfc-5t7bh\" (UID: \"99f9835b-e77b-4032-9238-b41836e0c480\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.529810 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-dns-svc\") pod 
\"dnsmasq-dns-78dd6ddcc-sf9zs\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.529846 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-config\") pod \"dnsmasq-dns-78dd6ddcc-sf9zs\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.529912 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g48t\" (UniqueName: \"kubernetes.io/projected/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-kube-api-access-7g48t\") pod \"dnsmasq-dns-78dd6ddcc-sf9zs\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.529957 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f9835b-e77b-4032-9238-b41836e0c480-config\") pod \"dnsmasq-dns-675f4bcbfc-5t7bh\" (UID: \"99f9835b-e77b-4032-9238-b41836e0c480\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.630877 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdj7d\" (UniqueName: \"kubernetes.io/projected/99f9835b-e77b-4032-9238-b41836e0c480-kube-api-access-gdj7d\") pod \"dnsmasq-dns-675f4bcbfc-5t7bh\" (UID: \"99f9835b-e77b-4032-9238-b41836e0c480\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.630923 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-dns-svc\") pod 
\"dnsmasq-dns-78dd6ddcc-sf9zs\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.630945 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-config\") pod \"dnsmasq-dns-78dd6ddcc-sf9zs\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.630993 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g48t\" (UniqueName: \"kubernetes.io/projected/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-kube-api-access-7g48t\") pod \"dnsmasq-dns-78dd6ddcc-sf9zs\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.631028 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f9835b-e77b-4032-9238-b41836e0c480-config\") pod \"dnsmasq-dns-675f4bcbfc-5t7bh\" (UID: \"99f9835b-e77b-4032-9238-b41836e0c480\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.632052 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f9835b-e77b-4032-9238-b41836e0c480-config\") pod \"dnsmasq-dns-675f4bcbfc-5t7bh\" (UID: \"99f9835b-e77b-4032-9238-b41836e0c480\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.632449 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sf9zs\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.632549 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-config\") pod \"dnsmasq-dns-78dd6ddcc-sf9zs\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.653099 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g48t\" (UniqueName: \"kubernetes.io/projected/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-kube-api-access-7g48t\") pod \"dnsmasq-dns-78dd6ddcc-sf9zs\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.680176 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdj7d\" (UniqueName: \"kubernetes.io/projected/99f9835b-e77b-4032-9238-b41836e0c480-kube-api-access-gdj7d\") pod \"dnsmasq-dns-675f4bcbfc-5t7bh\" (UID: \"99f9835b-e77b-4032-9238-b41836e0c480\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.827630 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:33:34 crc kubenswrapper[4760]: I1204 12:33:34.961257 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.196330 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"47735eb331db95a9c8463c133a692889dc631bd67fa11179c4ea953bd5406acf"} Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.364494 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sf9zs"] Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.376657 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5t7bh"] Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.437067 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hjm6n"] Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.439288 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.451882 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkf27\" (UniqueName: \"kubernetes.io/projected/34e48143-4c17-45c3-ba03-f9d4a1d054be-kube-api-access-tkf27\") pod \"dnsmasq-dns-666b6646f7-hjm6n\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.452220 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hjm6n\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.452423 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-config\") pod \"dnsmasq-dns-666b6646f7-hjm6n\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.455433 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hjm6n"] Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.554451 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkf27\" (UniqueName: \"kubernetes.io/projected/34e48143-4c17-45c3-ba03-f9d4a1d054be-kube-api-access-tkf27\") pod \"dnsmasq-dns-666b6646f7-hjm6n\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.554552 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hjm6n\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.554615 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-config\") pod \"dnsmasq-dns-666b6646f7-hjm6n\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.557359 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-config\") pod \"dnsmasq-dns-666b6646f7-hjm6n\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.557395 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hjm6n\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.591265 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkf27\" (UniqueName: \"kubernetes.io/projected/34e48143-4c17-45c3-ba03-f9d4a1d054be-kube-api-access-tkf27\") pod \"dnsmasq-dns-666b6646f7-hjm6n\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.768483 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:33:35 crc kubenswrapper[4760]: I1204 12:33:35.971793 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5t7bh"] Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.154885 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sf9zs"] Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.201817 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zpq26"] Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.204357 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.232984 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" event={"ID":"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0","Type":"ContainerStarted","Data":"685fb48ba0a8ab50777ec419cad6a13478ca68c6cf6cb6f219e296ec313cf13a"} Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.250666 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zpq26"] Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.254349 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" event={"ID":"99f9835b-e77b-4032-9238-b41836e0c480","Type":"ContainerStarted","Data":"349a1456a83d1433c2af601f805edef205f4c045577b7ee74a2d1f2c3b231e7e"} Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.281155 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zpq26\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.281255 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-config\") pod \"dnsmasq-dns-57d769cc4f-zpq26\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.281337 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86xf2\" (UniqueName: \"kubernetes.io/projected/cf889db1-7b40-45a5-b19b-f0992b77b406-kube-api-access-86xf2\") pod \"dnsmasq-dns-57d769cc4f-zpq26\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.360431 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hjm6n"] Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.388260 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-config\") pod \"dnsmasq-dns-57d769cc4f-zpq26\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.388331 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86xf2\" (UniqueName: \"kubernetes.io/projected/cf889db1-7b40-45a5-b19b-f0992b77b406-kube-api-access-86xf2\") pod \"dnsmasq-dns-57d769cc4f-zpq26\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.388404 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zpq26\" (UID: 
\"cf889db1-7b40-45a5-b19b-f0992b77b406\") " pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.389394 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-zpq26\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.389868 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-config\") pod \"dnsmasq-dns-57d769cc4f-zpq26\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.451474 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86xf2\" (UniqueName: \"kubernetes.io/projected/cf889db1-7b40-45a5-b19b-f0992b77b406-kube-api-access-86xf2\") pod \"dnsmasq-dns-57d769cc4f-zpq26\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.589107 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.658302 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.721066 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.725155 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-h6ktn" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.737110 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.740305 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.741089 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.742325 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.742418 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.742859 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.745044 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809590 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809655 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-config-data\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809687 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809720 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809759 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b6477a59-2dc3-4fff-907e-7e927cf257d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809807 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809835 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/b6477a59-2dc3-4fff-907e-7e927cf257d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809860 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tct7s\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-kube-api-access-tct7s\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809885 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809914 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.809946 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.911717 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") 
pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.911796 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-config-data\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.911825 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.911924 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.911988 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b6477a59-2dc3-4fff-907e-7e927cf257d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.912050 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.912080 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b6477a59-2dc3-4fff-907e-7e927cf257d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.912133 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tct7s\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-kube-api-access-tct7s\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.912161 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.912195 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.912390 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.913758 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.914962 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.922728 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b6477a59-2dc3-4fff-907e-7e927cf257d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.923186 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.935328 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.935650 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " 
pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.936357 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-config-data\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.938447 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.941811 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.970185 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b6477a59-2dc3-4fff-907e-7e927cf257d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:36 crc kubenswrapper[4760]: I1204 12:33:36.997868 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tct7s\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-kube-api-access-tct7s\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.020958 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " pod="openstack/rabbitmq-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.067005 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.304494 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zpq26"] Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.311784 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" event={"ID":"34e48143-4c17-45c3-ba03-f9d4a1d054be","Type":"ContainerStarted","Data":"1712061a2544d72050af96cd7daf23f7c6c7e49c5dca0bcd5523e04a39ff5624"} Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.330395 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.334105 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.343597 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dbftv" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.343859 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.344069 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.344218 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.344444 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.344616 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.349901 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.358364 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444380 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444487 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444528 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444551 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg7pk\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-kube-api-access-lg7pk\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444577 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444613 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62ceba36-f8bc-4644-978c-08a4cbf88ae5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444653 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444679 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444744 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444773 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.444798 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62ceba36-f8bc-4644-978c-08a4cbf88ae5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648188 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648604 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648632 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62ceba36-f8bc-4644-978c-08a4cbf88ae5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648668 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648717 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648741 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648759 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg7pk\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-kube-api-access-lg7pk\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648816 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648853 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62ceba36-f8bc-4644-978c-08a4cbf88ae5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.648907 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" 
Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.651550 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.667517 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.673250 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.674099 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.680279 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.680708 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.682456 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.687243 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62ceba36-f8bc-4644-978c-08a4cbf88ae5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.688516 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.696268 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62ceba36-f8bc-4644-978c-08a4cbf88ae5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.696355 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg7pk\" (UniqueName: 
\"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-kube-api-access-lg7pk\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.725091 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:37 crc kubenswrapper[4760]: I1204 12:33:37.997837 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.095484 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 12:33:38 crc kubenswrapper[4760]: W1204 12:33:38.152391 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6477a59_2dc3_4fff_907e_7e927cf257d3.slice/crio-3c3057d062e84cc3725c3c925330cfab62d4853e7a1ee840efbbb3e84b1f8e11 WatchSource:0}: Error finding container 3c3057d062e84cc3725c3c925330cfab62d4853e7a1ee840efbbb3e84b1f8e11: Status 404 returned error can't find the container with id 3c3057d062e84cc3725c3c925330cfab62d4853e7a1ee840efbbb3e84b1f8e11 Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.339002 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b6477a59-2dc3-4fff-907e-7e927cf257d3","Type":"ContainerStarted","Data":"3c3057d062e84cc3725c3c925330cfab62d4853e7a1ee840efbbb3e84b1f8e11"} Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.345658 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" 
event={"ID":"cf889db1-7b40-45a5-b19b-f0992b77b406","Type":"ContainerStarted","Data":"11547ad189443acb2bb63d01ca97d91608e2c0f53574533c5ece517a6cef455d"} Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.798368 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.801634 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.812140 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.812490 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wbhpx" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.812640 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.812770 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.815979 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.819687 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.903258 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.903357 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9649c38c-ebc4-4103-aa55-c2aa867d6e26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.903420 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsptw\" (UniqueName: \"kubernetes.io/projected/9649c38c-ebc4-4103-aa55-c2aa867d6e26-kube-api-access-vsptw\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.903468 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9649c38c-ebc4-4103-aa55-c2aa867d6e26-kolla-config\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.903594 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9649c38c-ebc4-4103-aa55-c2aa867d6e26-config-data-default\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.903622 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9649c38c-ebc4-4103-aa55-c2aa867d6e26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.903641 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9649c38c-ebc4-4103-aa55-c2aa867d6e26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:38 crc kubenswrapper[4760]: I1204 12:33:38.903679 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9649c38c-ebc4-4103-aa55-c2aa867d6e26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.006994 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9649c38c-ebc4-4103-aa55-c2aa867d6e26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.007083 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsptw\" (UniqueName: \"kubernetes.io/projected/9649c38c-ebc4-4103-aa55-c2aa867d6e26-kube-api-access-vsptw\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.007242 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9649c38c-ebc4-4103-aa55-c2aa867d6e26-kolla-config\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.007500 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9649c38c-ebc4-4103-aa55-c2aa867d6e26-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.007523 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9649c38c-ebc4-4103-aa55-c2aa867d6e26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.007573 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9649c38c-ebc4-4103-aa55-c2aa867d6e26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.007615 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9649c38c-ebc4-4103-aa55-c2aa867d6e26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.007697 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.008097 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc 
kubenswrapper[4760]: I1204 12:33:39.012576 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9649c38c-ebc4-4103-aa55-c2aa867d6e26-kolla-config\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.012723 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9649c38c-ebc4-4103-aa55-c2aa867d6e26-config-data-default\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.013482 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9649c38c-ebc4-4103-aa55-c2aa867d6e26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.026418 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9649c38c-ebc4-4103-aa55-c2aa867d6e26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.052015 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9649c38c-ebc4-4103-aa55-c2aa867d6e26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.052364 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9649c38c-ebc4-4103-aa55-c2aa867d6e26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.057568 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsptw\" (UniqueName: \"kubernetes.io/projected/9649c38c-ebc4-4103-aa55-c2aa867d6e26-kube-api-access-vsptw\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.120751 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"9649c38c-ebc4-4103-aa55-c2aa867d6e26\") " pod="openstack/openstack-galera-0" Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.130011 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 12:33:39 crc kubenswrapper[4760]: W1204 12:33:39.305684 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62ceba36_f8bc_4644_978c_08a4cbf88ae5.slice/crio-31de9a10947fd0eaa552656a9f4718ca456b088f24a766dc1ed1fc31c5baa428 WatchSource:0}: Error finding container 31de9a10947fd0eaa552656a9f4718ca456b088f24a766dc1ed1fc31c5baa428: Status 404 returned error can't find the container with id 31de9a10947fd0eaa552656a9f4718ca456b088f24a766dc1ed1fc31c5baa428 Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.405271 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62ceba36-f8bc-4644-978c-08a4cbf88ae5","Type":"ContainerStarted","Data":"31de9a10947fd0eaa552656a9f4718ca456b088f24a766dc1ed1fc31c5baa428"} Dec 04 12:33:39 crc kubenswrapper[4760]: I1204 12:33:39.460644 4760 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.332296 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.335537 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.344391 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cn98r" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.344705 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.344848 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.345096 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.386055 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.459410 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 12:33:40 crc kubenswrapper[4760]: W1204 12:33:40.483950 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9649c38c_ebc4_4103_aa55_c2aa867d6e26.slice/crio-dc4e8ab40533de85575d0103e9c842046fe0582192118f0f4a184acdcb0d5288 WatchSource:0}: Error finding container dc4e8ab40533de85575d0103e9c842046fe0582192118f0f4a184acdcb0d5288: Status 404 returned error can't find the container with id dc4e8ab40533de85575d0103e9c842046fe0582192118f0f4a184acdcb0d5288 Dec 04 
12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.501962 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.502055 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.502150 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.502188 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.502268 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" 
Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.502298 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.502344 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.502402 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvw2l\" (UniqueName: \"kubernetes.io/projected/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-kube-api-access-cvw2l\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.603812 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.609844 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc 
kubenswrapper[4760]: I1204 12:33:40.609989 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.610068 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.610454 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.610500 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.610689 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.610945 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cvw2l\" (UniqueName: \"kubernetes.io/projected/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-kube-api-access-cvw2l\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.605329 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.611988 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.613352 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.614570 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.614836 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.627762 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.634438 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.651633 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.653683 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvw2l\" (UniqueName: \"kubernetes.io/projected/db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff-kube-api-access-cvw2l\") pod \"openstack-cell1-galera-0\" (UID: \"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff\") " pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.691934 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.872110 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 04 12:33:40 crc kubenswrapper[4760]: I1204 12:33:40.875060 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:40.890699 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:40.891202 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:40.891500 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mzcvq" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.260933 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40657d96-3f5d-4da0-9783-845c41bfeaae-combined-ca-bundle\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.261121 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/40657d96-3f5d-4da0-9783-845c41bfeaae-memcached-tls-certs\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.261274 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/40657d96-3f5d-4da0-9783-845c41bfeaae-kolla-config\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " 
pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.261392 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9gpw\" (UniqueName: \"kubernetes.io/projected/40657d96-3f5d-4da0-9783-845c41bfeaae-kube-api-access-f9gpw\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.261479 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40657d96-3f5d-4da0-9783-845c41bfeaae-config-data\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.288601 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.364149 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40657d96-3f5d-4da0-9783-845c41bfeaae-combined-ca-bundle\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.364295 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/40657d96-3f5d-4da0-9783-845c41bfeaae-memcached-tls-certs\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.364370 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/40657d96-3f5d-4da0-9783-845c41bfeaae-kolla-config\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " 
pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.364656 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9gpw\" (UniqueName: \"kubernetes.io/projected/40657d96-3f5d-4da0-9783-845c41bfeaae-kube-api-access-f9gpw\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.364688 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40657d96-3f5d-4da0-9783-845c41bfeaae-config-data\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.367536 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/40657d96-3f5d-4da0-9783-845c41bfeaae-kolla-config\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.367936 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40657d96-3f5d-4da0-9783-845c41bfeaae-config-data\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.388978 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/40657d96-3f5d-4da0-9783-845c41bfeaae-memcached-tls-certs\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.389209 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/40657d96-3f5d-4da0-9783-845c41bfeaae-combined-ca-bundle\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.393125 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9gpw\" (UniqueName: \"kubernetes.io/projected/40657d96-3f5d-4da0-9783-845c41bfeaae-kube-api-access-f9gpw\") pod \"memcached-0\" (UID: \"40657d96-3f5d-4da0-9783-845c41bfeaae\") " pod="openstack/memcached-0" Dec 04 12:33:41 crc kubenswrapper[4760]: I1204 12:33:41.484514 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9649c38c-ebc4-4103-aa55-c2aa867d6e26","Type":"ContainerStarted","Data":"dc4e8ab40533de85575d0103e9c842046fe0582192118f0f4a184acdcb0d5288"} Dec 04 12:33:42 crc kubenswrapper[4760]: I1204 12:33:42.074854 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 12:33:43 crc kubenswrapper[4760]: I1204 12:33:43.211906 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 12:33:43 crc kubenswrapper[4760]: W1204 12:33:43.234125 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb4dbf6c_c9d0_4f9c_889d_a453bb2da6ff.slice/crio-473c5d8dc4c2f6a2395cc12f16837c2b46fb3c1fb06278bfa4170e0b51108acc WatchSource:0}: Error finding container 473c5d8dc4c2f6a2395cc12f16837c2b46fb3c1fb06278bfa4170e0b51108acc: Status 404 returned error can't find the container with id 473c5d8dc4c2f6a2395cc12f16837c2b46fb3c1fb06278bfa4170e0b51108acc Dec 04 12:33:43 crc kubenswrapper[4760]: I1204 12:33:43.695827 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 12:33:44 crc kubenswrapper[4760]: I1204 12:33:44.076686 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff","Type":"ContainerStarted","Data":"473c5d8dc4c2f6a2395cc12f16837c2b46fb3c1fb06278bfa4170e0b51108acc"} Dec 04 12:33:44 crc kubenswrapper[4760]: I1204 12:33:44.094373 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"40657d96-3f5d-4da0-9783-845c41bfeaae","Type":"ContainerStarted","Data":"cb3854ed09a7209638af4c7856d41ed6f8c86e4d7e27d97c8eb6b4a77c68686a"} Dec 04 12:33:45 crc kubenswrapper[4760]: I1204 12:33:45.443274 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 12:33:45 crc kubenswrapper[4760]: I1204 12:33:45.449419 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 12:33:45 crc kubenswrapper[4760]: I1204 12:33:45.455833 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-9j6nj" Dec 04 12:33:45 crc kubenswrapper[4760]: I1204 12:33:45.469470 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 12:33:45 crc kubenswrapper[4760]: I1204 12:33:45.531251 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh66t\" (UniqueName: \"kubernetes.io/projected/d389a85e-f3e6-4ad5-b11d-9555d6aee3a8-kube-api-access-rh66t\") pod \"kube-state-metrics-0\" (UID: \"d389a85e-f3e6-4ad5-b11d-9555d6aee3a8\") " pod="openstack/kube-state-metrics-0" Dec 04 12:33:45 crc kubenswrapper[4760]: I1204 12:33:45.633579 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh66t\" (UniqueName: \"kubernetes.io/projected/d389a85e-f3e6-4ad5-b11d-9555d6aee3a8-kube-api-access-rh66t\") pod \"kube-state-metrics-0\" (UID: \"d389a85e-f3e6-4ad5-b11d-9555d6aee3a8\") " pod="openstack/kube-state-metrics-0" Dec 04 12:33:45 crc kubenswrapper[4760]: I1204 
12:33:45.668627 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh66t\" (UniqueName: \"kubernetes.io/projected/d389a85e-f3e6-4ad5-b11d-9555d6aee3a8-kube-api-access-rh66t\") pod \"kube-state-metrics-0\" (UID: \"d389a85e-f3e6-4ad5-b11d-9555d6aee3a8\") " pod="openstack/kube-state-metrics-0" Dec 04 12:33:45 crc kubenswrapper[4760]: I1204 12:33:45.789496 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.159616 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-frcmm"] Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.170655 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.184739 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.186141 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-vw8v8" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.186632 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.242503 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-frcmm"] Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.268750 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52334746-99b6-4056-a7d7-6df95b72d8de-var-run\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.268854 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/52334746-99b6-4056-a7d7-6df95b72d8de-ovn-controller-tls-certs\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.268992 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52334746-99b6-4056-a7d7-6df95b72d8de-var-run-ovn\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.269039 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52334746-99b6-4056-a7d7-6df95b72d8de-scripts\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.269190 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4v96\" (UniqueName: \"kubernetes.io/projected/52334746-99b6-4056-a7d7-6df95b72d8de-kube-api-access-r4v96\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.269264 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52334746-99b6-4056-a7d7-6df95b72d8de-var-log-ovn\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.269350 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52334746-99b6-4056-a7d7-6df95b72d8de-combined-ca-bundle\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.280640 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-76hxp"] Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.283943 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.337335 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-76hxp"] Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.798381 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8qmn\" (UniqueName: \"kubernetes.io/projected/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-kube-api-access-r8qmn\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.798523 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52334746-99b6-4056-a7d7-6df95b72d8de-var-run-ovn\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.798674 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/52334746-99b6-4056-a7d7-6df95b72d8de-scripts\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 
12:33:46.798773 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4v96\" (UniqueName: \"kubernetes.io/projected/52334746-99b6-4056-a7d7-6df95b72d8de-kube-api-access-r4v96\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.798843 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52334746-99b6-4056-a7d7-6df95b72d8de-var-log-ovn\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.799405 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-var-lib\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.799442 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-etc-ovs\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.799761 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-var-run\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.799809 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-var-log\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.799864 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-scripts\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.800051 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52334746-99b6-4056-a7d7-6df95b72d8de-combined-ca-bundle\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.800140 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52334746-99b6-4056-a7d7-6df95b72d8de-var-run\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.800177 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/52334746-99b6-4056-a7d7-6df95b72d8de-ovn-controller-tls-certs\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.820458 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/52334746-99b6-4056-a7d7-6df95b72d8de-scripts\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.843710 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/52334746-99b6-4056-a7d7-6df95b72d8de-var-run-ovn\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.845115 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/52334746-99b6-4056-a7d7-6df95b72d8de-var-run\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.845463 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/52334746-99b6-4056-a7d7-6df95b72d8de-var-log-ovn\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.846643 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/52334746-99b6-4056-a7d7-6df95b72d8de-ovn-controller-tls-certs\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.936489 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8qmn\" (UniqueName: \"kubernetes.io/projected/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-kube-api-access-r8qmn\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " 
pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.936963 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-var-lib\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.936986 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-etc-ovs\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.937022 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-var-run\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.937042 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-var-log\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.937060 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-scripts\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.940584 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-var-lib\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.940694 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-etc-ovs\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.940894 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-var-run\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.940971 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-var-log\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:46 crc kubenswrapper[4760]: I1204 12:33:46.949060 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-scripts\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.023166 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.065073 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4v96\" 
(UniqueName: \"kubernetes.io/projected/52334746-99b6-4056-a7d7-6df95b72d8de-kube-api-access-r4v96\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.088250 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8qmn\" (UniqueName: \"kubernetes.io/projected/7198d8b5-1a9e-45e7-8151-922d62c1e1f0-kube-api-access-r8qmn\") pod \"ovn-controller-ovs-76hxp\" (UID: \"7198d8b5-1a9e-45e7-8151-922d62c1e1f0\") " pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.091259 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.104570 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.104787 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.105071 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-rk2t4" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.119457 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.119916 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.135621 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.235844 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.254286 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591b6997-8c15-499c-8218-e222a178559e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.254384 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/591b6997-8c15-499c-8218-e222a178559e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.254413 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/591b6997-8c15-499c-8218-e222a178559e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.254575 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591b6997-8c15-499c-8218-e222a178559e-config\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.254599 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 
crc kubenswrapper[4760]: I1204 12:33:47.254631 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qw4g\" (UniqueName: \"kubernetes.io/projected/591b6997-8c15-499c-8218-e222a178559e-kube-api-access-7qw4g\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.254649 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/591b6997-8c15-499c-8218-e222a178559e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.254676 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/591b6997-8c15-499c-8218-e222a178559e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.356426 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591b6997-8c15-499c-8218-e222a178559e-config\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.356499 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.356528 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qw4g\" 
(UniqueName: \"kubernetes.io/projected/591b6997-8c15-499c-8218-e222a178559e-kube-api-access-7qw4g\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.356562 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/591b6997-8c15-499c-8218-e222a178559e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.356591 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/591b6997-8c15-499c-8218-e222a178559e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.356661 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591b6997-8c15-499c-8218-e222a178559e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.356691 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/591b6997-8c15-499c-8218-e222a178559e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.356717 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/591b6997-8c15-499c-8218-e222a178559e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.359298 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/591b6997-8c15-499c-8218-e222a178559e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.360181 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591b6997-8c15-499c-8218-e222a178559e-config\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.360596 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.366150 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/591b6997-8c15-499c-8218-e222a178559e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.384225 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/591b6997-8c15-499c-8218-e222a178559e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.386666 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/591b6997-8c15-499c-8218-e222a178559e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.386828 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/591b6997-8c15-499c-8218-e222a178559e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.475882 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52334746-99b6-4056-a7d7-6df95b72d8de-combined-ca-bundle\") pod \"ovn-controller-frcmm\" (UID: \"52334746-99b6-4056-a7d7-6df95b72d8de\") " pod="openstack/ovn-controller-frcmm" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.489194 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qw4g\" (UniqueName: \"kubernetes.io/projected/591b6997-8c15-499c-8218-e222a178559e-kube-api-access-7qw4g\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.539601 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"591b6997-8c15-499c-8218-e222a178559e\") " pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.642110 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.738252 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-frcmm" Dec 04 12:33:47 crc kubenswrapper[4760]: W1204 12:33:47.775241 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd389a85e_f3e6_4ad5_b11d_9555d6aee3a8.slice/crio-9b3d45ba2a0d36b16cc87922423097a3e0ee16471538f49a7d0f63218e5d43d3 WatchSource:0}: Error finding container 9b3d45ba2a0d36b16cc87922423097a3e0ee16471538f49a7d0f63218e5d43d3: Status 404 returned error can't find the container with id 9b3d45ba2a0d36b16cc87922423097a3e0ee16471538f49a7d0f63218e5d43d3 Dec 04 12:33:47 crc kubenswrapper[4760]: I1204 12:33:47.802790 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 12:33:48 crc kubenswrapper[4760]: I1204 12:33:48.335477 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d389a85e-f3e6-4ad5-b11d-9555d6aee3a8","Type":"ContainerStarted","Data":"9b3d45ba2a0d36b16cc87922423097a3e0ee16471538f49a7d0f63218e5d43d3"} Dec 04 12:33:49 crc kubenswrapper[4760]: I1204 12:33:49.412177 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-76hxp"] Dec 04 12:33:49 crc kubenswrapper[4760]: I1204 12:33:49.423736 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-frcmm"] Dec 04 12:33:49 crc kubenswrapper[4760]: I1204 12:33:49.650673 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.175294 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xwlm9"] Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.180898 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.184648 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.217530 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a4174c36-258e-4ed9-b6a7-f52818d3faed-ovn-rundir\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.218053 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4174c36-258e-4ed9-b6a7-f52818d3faed-config\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.218126 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b2nj\" (UniqueName: \"kubernetes.io/projected/a4174c36-258e-4ed9-b6a7-f52818d3faed-kube-api-access-4b2nj\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.218162 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a4174c36-258e-4ed9-b6a7-f52818d3faed-ovs-rundir\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.218353 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4174c36-258e-4ed9-b6a7-f52818d3faed-combined-ca-bundle\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.218430 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4174c36-258e-4ed9-b6a7-f52818d3faed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.284743 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xwlm9"] Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.332421 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4174c36-258e-4ed9-b6a7-f52818d3faed-combined-ca-bundle\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.332497 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4174c36-258e-4ed9-b6a7-f52818d3faed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.332574 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a4174c36-258e-4ed9-b6a7-f52818d3faed-ovn-rundir\") pod \"ovn-controller-metrics-xwlm9\" (UID: 
\"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.332616 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4174c36-258e-4ed9-b6a7-f52818d3faed-config\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.332659 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b2nj\" (UniqueName: \"kubernetes.io/projected/a4174c36-258e-4ed9-b6a7-f52818d3faed-kube-api-access-4b2nj\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.332697 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a4174c36-258e-4ed9-b6a7-f52818d3faed-ovs-rundir\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.333146 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a4174c36-258e-4ed9-b6a7-f52818d3faed-ovs-rundir\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.333203 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a4174c36-258e-4ed9-b6a7-f52818d3faed-ovn-rundir\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 
12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.334181 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4174c36-258e-4ed9-b6a7-f52818d3faed-config\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.353228 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4174c36-258e-4ed9-b6a7-f52818d3faed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.411947 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b2nj\" (UniqueName: \"kubernetes.io/projected/a4174c36-258e-4ed9-b6a7-f52818d3faed-kube-api-access-4b2nj\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.415266 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4174c36-258e-4ed9-b6a7-f52818d3faed-combined-ca-bundle\") pod \"ovn-controller-metrics-xwlm9\" (UID: \"a4174c36-258e-4ed9-b6a7-f52818d3faed\") " pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.522736 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-xwlm9" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.527868 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-frcmm" event={"ID":"52334746-99b6-4056-a7d7-6df95b72d8de","Type":"ContainerStarted","Data":"41c1cd77fec88ed9737383dbeaf046a1f0a2c5b4797e09f3a3a9932c4a3987ee"} Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.578197 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"591b6997-8c15-499c-8218-e222a178559e","Type":"ContainerStarted","Data":"95df61ac35ac8e3a4d0c653d1bb95ad8ec2fb59c9bb73d6f151d37e3268531e8"} Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.617896 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-76hxp" event={"ID":"7198d8b5-1a9e-45e7-8151-922d62c1e1f0","Type":"ContainerStarted","Data":"a33b5ae5d2d2dc45d448ddd8a56a7c33e2957b66b902160539eeacf79f9227c4"} Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.619162 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hjm6n"] Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.699290 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ws7vm"] Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.701416 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.886938 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-config\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.886999 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.887039 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.887144 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gfhq\" (UniqueName: \"kubernetes.io/projected/dbe589a8-e8b2-4c34-913d-cc529e207984-kube-api-access-4gfhq\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.890621 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.987600 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7fd796d7df-ws7vm"] Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.990811 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-config\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.990927 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.990993 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.991664 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gfhq\" (UniqueName: \"kubernetes.io/projected/dbe589a8-e8b2-4c34-913d-cc529e207984-kube-api-access-4gfhq\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.992817 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: 
I1204 12:33:50.992859 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:50 crc kubenswrapper[4760]: I1204 12:33:50.992977 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-config\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:51 crc kubenswrapper[4760]: I1204 12:33:51.082375 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gfhq\" (UniqueName: \"kubernetes.io/projected/dbe589a8-e8b2-4c34-913d-cc529e207984-kube-api-access-4gfhq\") pod \"dnsmasq-dns-7fd796d7df-ws7vm\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:51 crc kubenswrapper[4760]: I1204 12:33:51.214027 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.647071 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.650895 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.662511 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.662743 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-wvcqf" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.662894 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.669937 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.671416 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.742891 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xwlm9"] Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.758082 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b055fc8b-2181-441b-b4b3-efa345cfde65-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.759899 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b055fc8b-2181-441b-b4b3-efa345cfde65-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.760053 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b055fc8b-2181-441b-b4b3-efa345cfde65-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.760357 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b055fc8b-2181-441b-b4b3-efa345cfde65-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.761032 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b055fc8b-2181-441b-b4b3-efa345cfde65-config\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.761309 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.765114 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b055fc8b-2181-441b-b4b3-efa345cfde65-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:52 crc kubenswrapper[4760]: I1204 12:33:52.765266 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhf9k\" (UniqueName: 
\"kubernetes.io/projected/b055fc8b-2181-441b-b4b3-efa345cfde65-kube-api-access-rhf9k\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:52.870819 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b055fc8b-2181-441b-b4b3-efa345cfde65-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.209338 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b055fc8b-2181-441b-b4b3-efa345cfde65-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.209469 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b055fc8b-2181-441b-b4b3-efa345cfde65-config\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.209509 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.209530 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b055fc8b-2181-441b-b4b3-efa345cfde65-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " 
pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.209561 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhf9k\" (UniqueName: \"kubernetes.io/projected/b055fc8b-2181-441b-b4b3-efa345cfde65-kube-api-access-rhf9k\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.209741 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b055fc8b-2181-441b-b4b3-efa345cfde65-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.210467 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b055fc8b-2181-441b-b4b3-efa345cfde65-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.220450 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b055fc8b-2181-441b-b4b3-efa345cfde65-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.225616 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b055fc8b-2181-441b-b4b3-efa345cfde65-config\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.225161 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b055fc8b-2181-441b-b4b3-efa345cfde65-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.234069 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.259746 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b055fc8b-2181-441b-b4b3-efa345cfde65-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.262024 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b055fc8b-2181-441b-b4b3-efa345cfde65-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.305798 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhf9k\" (UniqueName: \"kubernetes.io/projected/b055fc8b-2181-441b-b4b3-efa345cfde65-kube-api-access-rhf9k\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.308122 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " 
pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.333962 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b055fc8b-2181-441b-b4b3-efa345cfde65-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b055fc8b-2181-441b-b4b3-efa345cfde65\") " pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:53 crc kubenswrapper[4760]: I1204 12:33:53.598889 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 12:33:54 crc kubenswrapper[4760]: W1204 12:33:54.658795 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4174c36_258e_4ed9_b6a7_f52818d3faed.slice/crio-b53391c0ef6e15b774a76ef433c4683748662f650bc07a94309d61ad0a7e5ea8 WatchSource:0}: Error finding container b53391c0ef6e15b774a76ef433c4683748662f650bc07a94309d61ad0a7e5ea8: Status 404 returned error can't find the container with id b53391c0ef6e15b774a76ef433c4683748662f650bc07a94309d61ad0a7e5ea8 Dec 04 12:33:55 crc kubenswrapper[4760]: I1204 12:33:55.636079 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xwlm9" event={"ID":"a4174c36-258e-4ed9-b6a7-f52818d3faed","Type":"ContainerStarted","Data":"b53391c0ef6e15b774a76ef433c4683748662f650bc07a94309d61ad0a7e5ea8"} Dec 04 12:34:00 crc kubenswrapper[4760]: I1204 12:34:00.062683 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ws7vm"] Dec 04 12:34:00 crc kubenswrapper[4760]: I1204 12:34:00.945760 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 12:34:07 crc kubenswrapper[4760]: I1204 12:34:07.950118 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"b055fc8b-2181-441b-b4b3-efa345cfde65","Type":"ContainerStarted","Data":"999e8b4230c4e9e72f2cf2e6e4fbe27284bf8e211b8c25742f16ebcde8257354"} Dec 04 12:34:07 crc kubenswrapper[4760]: I1204 12:34:07.954906 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" event={"ID":"dbe589a8-e8b2-4c34-913d-cc529e207984","Type":"ContainerStarted","Data":"b853ae26f73fb50f4628edb7046f13ebc4c993fee91da76ad4cababc0f2415aa"} Dec 04 12:34:13 crc kubenswrapper[4760]: E1204 12:34:13.038160 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 04 12:34:13 crc kubenswrapper[4760]: E1204 12:34:13.039809 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tct7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(b6477a59-2dc3-4fff-907e-7e927cf257d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:13 crc 
kubenswrapper[4760]: E1204 12:34:13.041779 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" Dec 04 12:34:14 crc kubenswrapper[4760]: E1204 12:34:14.030005 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" Dec 04 12:34:15 crc kubenswrapper[4760]: E1204 12:34:15.457159 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 04 12:34:15 crc kubenswrapper[4760]: E1204 12:34:15.457951 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vsptw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(9649c38c-ebc4-4103-aa55-c2aa867d6e26): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:15 crc kubenswrapper[4760]: E1204 12:34:15.459237 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="9649c38c-ebc4-4103-aa55-c2aa867d6e26" Dec 04 12:34:16 crc kubenswrapper[4760]: E1204 12:34:16.054924 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="9649c38c-ebc4-4103-aa55-c2aa867d6e26" Dec 04 12:34:17 crc kubenswrapper[4760]: E1204 12:34:17.925134 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 04 12:34:17 crc kubenswrapper[4760]: E1204 12:34:17.925968 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lg7pk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(62ceba36-f8bc-4644-978c-08a4cbf88ae5): ErrImagePull: rpc error: code = Canceled 
desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:17 crc kubenswrapper[4760]: E1204 12:34:17.930314 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" Dec 04 12:34:18 crc kubenswrapper[4760]: E1204 12:34:18.075597 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" Dec 04 12:34:18 crc kubenswrapper[4760]: E1204 12:34:18.660701 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 04 12:34:18 crc kubenswrapper[4760]: E1204 12:34:18.660945 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n55fh5chf4h5b5hc8hddh5bch687hd4h687h7dh7dh9hc6hd7h56fhbhdfh8h5c5h5cdh558h648h546h8ch5c5h687h675h589h699hbh649q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f9gpw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(40657d96-3f5d-4da0-9783-845c41bfeaae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:18 crc kubenswrapper[4760]: E1204 12:34:18.662175 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="40657d96-3f5d-4da0-9783-845c41bfeaae" Dec 04 12:34:19 crc kubenswrapper[4760]: E1204 12:34:19.089779 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="40657d96-3f5d-4da0-9783-845c41bfeaae" Dec 04 12:34:20 crc kubenswrapper[4760]: E1204 12:34:20.180113 4760 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified" Dec 04 12:34:20 crc kubenswrapper[4760]: E1204 12:34:20.180406 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n84h56bhc4h57fh674h65ch9bh676h86h98h546h86h5c8h89h686h654hd6h5d5h54bh575h94hf5h65dh587h5bch5fh5d8h574h5d6h579h5c5h68cq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8qmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN 
SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-76hxp_openstack(7198d8b5-1a9e-45e7-8151-922d62c1e1f0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:20 crc kubenswrapper[4760]: E1204 12:34:20.181590 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-76hxp" podUID="7198d8b5-1a9e-45e7-8151-922d62c1e1f0" Dec 04 12:34:20 crc kubenswrapper[4760]: E1204 12:34:20.206575 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 04 12:34:20 crc kubenswrapper[4760]: E1204 12:34:20.206918 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cvw2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:20 crc kubenswrapper[4760]: E1204 12:34:20.208392 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff" Dec 04 12:34:21 crc kubenswrapper[4760]: E1204 12:34:21.105152 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff" Dec 04 12:34:21 crc kubenswrapper[4760]: E1204 12:34:21.105546 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-76hxp" podUID="7198d8b5-1a9e-45e7-8151-922d62c1e1f0" Dec 04 12:34:28 crc kubenswrapper[4760]: E1204 12:34:28.747719 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Dec 04 12:34:28 crc kubenswrapper[4760]: E1204 12:34:28.749040 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n84h56bhc4h57fh674h65ch9bh676h86h98h546h86h5c8h89h686h654hd6h5d5h54bh575h94hf5h65dh587h5bch5fh5d8h574h5d6h579h5c5h68cq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r4v96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC
:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-frcmm_openstack(52334746-99b6-4056-a7d7-6df95b72d8de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:28 crc kubenswrapper[4760]: E1204 12:34:28.750354 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-frcmm" podUID="52334746-99b6-4056-a7d7-6df95b72d8de" Dec 04 12:34:29 crc kubenswrapper[4760]: E1204 12:34:29.088851 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Dec 04 12:34:29 crc kubenswrapper[4760]: E1204 12:34:29.089135 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n565h55bh64ch656h586h56ch5bdh586h5ch64hd4h59fh5bhbch9ch57dh675h598h54h68dh59bhdh6ch54fh5fdh5f9h597h585h5d5hf9h577h98q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-a
pi-access-7qw4g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(591b6997-8c15-499c-8218-e222a178559e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:29 crc kubenswrapper[4760]: E1204 12:34:29.182442 4760 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-frcmm" podUID="52334746-99b6-4056-a7d7-6df95b72d8de" Dec 04 12:34:29 crc kubenswrapper[4760]: E1204 12:34:29.315570 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified" Dec 04 12:34:29 crc kubenswrapper[4760]: E1204 12:34:29.315852 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:n58dh5d4h5b7h5dh76h5bfh5d5h5fh6bh5f6h64fh5f5h586hd9h5fh64fh65dh56chdfh5bfh56bh675h578h5dbhch59h59fhdch5fh65bhdh69q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovs-rundir,ReadOnly:true,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-rundir,ReadOnly:true,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnme
trics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4b2nj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-metrics-xwlm9_openstack(a4174c36-258e-4ed9-b6a7-f52818d3faed): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:29 crc kubenswrapper[4760]: E1204 12:34:29.317131 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-metrics-xwlm9" podUID="a4174c36-258e-4ed9-b6a7-f52818d3faed" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.115088 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.115534 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7g48t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*fa
lse,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-sf9zs_openstack(9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.116663 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" podUID="9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.194330 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"" pod="openstack/ovn-controller-metrics-xwlm9" podUID="a4174c36-258e-4ed9-b6a7-f52818d3faed" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.197357 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.197608 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdj7d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-5t7bh_openstack(99f9835b-e77b-4032-9238-b41836e0c480): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.198855 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" podUID="99f9835b-e77b-4032-9238-b41836e0c480" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.333769 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.334612 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57h5bdh5b7h57h685h7h548h659h5fdh9fhb4h5c7h567hc7hcch66dh65dh686h698h9dh647h674h5dch598h64ch96h67fh576h55dh585h6fh55q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:n
il,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rhf9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(b055fc8b-2181-441b-b4b3-efa345cfde65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.483685 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.483992 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkf27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-hjm6n_openstack(34e48143-4c17-45c3-ba03-f9d4a1d054be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.485337 4760 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" podUID="34e48143-4c17-45c3-ba03-f9d4a1d054be" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.509135 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.509398 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-86xf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-zpq26_openstack(cf889db1-7b40-45a5-b19b-f0992b77b406): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.512658 4760 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" podUID="cf889db1-7b40-45a5-b19b-f0992b77b406" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.816074 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.816141 4760 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.816377 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rh66t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(d389a85e-f3e6-4ad5-b11d-9555d6aee3a8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 12:34:30 crc kubenswrapper[4760]: E1204 12:34:30.817602 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="d389a85e-f3e6-4ad5-b11d-9555d6aee3a8" Dec 04 12:34:30 crc kubenswrapper[4760]: I1204 12:34:30.930621 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.086739 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-dns-svc\") pod \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.087446 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-config\") pod \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.087567 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g48t\" (UniqueName: \"kubernetes.io/projected/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-kube-api-access-7g48t\") pod \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\" (UID: \"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0\") " Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.088241 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0" (UID: "9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.088461 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-config" (OuterVolumeSpecName: "config") pod "9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0" (UID: "9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.094506 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-kube-api-access-7g48t" (OuterVolumeSpecName: "kube-api-access-7g48t") pod "9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0" (UID: "9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0"). InnerVolumeSpecName "kube-api-access-7g48t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.190844 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.190923 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.190939 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g48t\" (UniqueName: \"kubernetes.io/projected/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0-kube-api-access-7g48t\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.201804 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" event={"ID":"9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0","Type":"ContainerDied","Data":"685fb48ba0a8ab50777ec419cad6a13478ca68c6cf6cb6f219e296ec313cf13a"} Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.201845 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sf9zs" Dec 04 12:34:31 crc kubenswrapper[4760]: E1204 12:34:31.206314 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="d389a85e-f3e6-4ad5-b11d-9555d6aee3a8" Dec 04 12:34:31 crc kubenswrapper[4760]: E1204 12:34:31.303761 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="b055fc8b-2181-441b-b4b3-efa345cfde65" Dec 04 12:34:31 crc kubenswrapper[4760]: E1204 12:34:31.373557 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="591b6997-8c15-499c-8218-e222a178559e" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.434774 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sf9zs"] Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.459773 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sf9zs"] Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.676751 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.707068 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.719338 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdj7d\" (UniqueName: \"kubernetes.io/projected/99f9835b-e77b-4032-9238-b41836e0c480-kube-api-access-gdj7d\") pod \"99f9835b-e77b-4032-9238-b41836e0c480\" (UID: \"99f9835b-e77b-4032-9238-b41836e0c480\") " Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.719471 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkf27\" (UniqueName: \"kubernetes.io/projected/34e48143-4c17-45c3-ba03-f9d4a1d054be-kube-api-access-tkf27\") pod \"34e48143-4c17-45c3-ba03-f9d4a1d054be\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.719537 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-config\") pod \"34e48143-4c17-45c3-ba03-f9d4a1d054be\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.719577 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-dns-svc\") pod \"34e48143-4c17-45c3-ba03-f9d4a1d054be\" (UID: \"34e48143-4c17-45c3-ba03-f9d4a1d054be\") " Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.719646 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f9835b-e77b-4032-9238-b41836e0c480-config\") pod \"99f9835b-e77b-4032-9238-b41836e0c480\" (UID: \"99f9835b-e77b-4032-9238-b41836e0c480\") " Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.720845 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/99f9835b-e77b-4032-9238-b41836e0c480-config" (OuterVolumeSpecName: "config") pod "99f9835b-e77b-4032-9238-b41836e0c480" (UID: "99f9835b-e77b-4032-9238-b41836e0c480"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.721517 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-config" (OuterVolumeSpecName: "config") pod "34e48143-4c17-45c3-ba03-f9d4a1d054be" (UID: "34e48143-4c17-45c3-ba03-f9d4a1d054be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.721833 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34e48143-4c17-45c3-ba03-f9d4a1d054be" (UID: "34e48143-4c17-45c3-ba03-f9d4a1d054be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.731966 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e48143-4c17-45c3-ba03-f9d4a1d054be-kube-api-access-tkf27" (OuterVolumeSpecName: "kube-api-access-tkf27") pod "34e48143-4c17-45c3-ba03-f9d4a1d054be" (UID: "34e48143-4c17-45c3-ba03-f9d4a1d054be"). InnerVolumeSpecName "kube-api-access-tkf27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.736459 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f9835b-e77b-4032-9238-b41836e0c480-kube-api-access-gdj7d" (OuterVolumeSpecName: "kube-api-access-gdj7d") pod "99f9835b-e77b-4032-9238-b41836e0c480" (UID: "99f9835b-e77b-4032-9238-b41836e0c480"). InnerVolumeSpecName "kube-api-access-gdj7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.823220 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdj7d\" (UniqueName: \"kubernetes.io/projected/99f9835b-e77b-4032-9238-b41836e0c480-kube-api-access-gdj7d\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.823815 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkf27\" (UniqueName: \"kubernetes.io/projected/34e48143-4c17-45c3-ba03-f9d4a1d054be-kube-api-access-tkf27\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.823837 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.823850 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34e48143-4c17-45c3-ba03-f9d4a1d054be-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.823863 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f9835b-e77b-4032-9238-b41836e0c480-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:31 crc kubenswrapper[4760]: I1204 12:34:31.879739 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0" path="/var/lib/kubelet/pods/9c795ad0-30cb-4eb5-ace1-8db6aa79f8b0/volumes" Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.214949 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"591b6997-8c15-499c-8218-e222a178559e","Type":"ContainerStarted","Data":"b475301e0a02d61339df909d6c91420581442f1f9e53bbef95457f194532fa2f"} Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.217391 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" event={"ID":"34e48143-4c17-45c3-ba03-f9d4a1d054be","Type":"ContainerDied","Data":"1712061a2544d72050af96cd7daf23f7c6c7e49c5dca0bcd5523e04a39ff5624"} Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.217507 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hjm6n" Dec 04 12:34:32 crc kubenswrapper[4760]: E1204 12:34:32.217713 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="591b6997-8c15-499c-8218-e222a178559e" Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.219723 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" event={"ID":"99f9835b-e77b-4032-9238-b41836e0c480","Type":"ContainerDied","Data":"349a1456a83d1433c2af601f805edef205f4c045577b7ee74a2d1f2c3b231e7e"} Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.219817 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5t7bh" Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.224471 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9649c38c-ebc4-4103-aa55-c2aa867d6e26","Type":"ContainerStarted","Data":"4a6f18f194f9cfd98aa620d45e3a452eb5c24ee29372a7ada3a26caade00c544"} Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.227016 4760 generic.go:334] "Generic (PLEG): container finished" podID="cf889db1-7b40-45a5-b19b-f0992b77b406" containerID="ac4c2b698f67896fc28744d6612c6dd89e34346b612eda053795cf9dde190acc" exitCode=0 Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.227101 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" event={"ID":"cf889db1-7b40-45a5-b19b-f0992b77b406","Type":"ContainerDied","Data":"ac4c2b698f67896fc28744d6612c6dd89e34346b612eda053795cf9dde190acc"} Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.231152 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b055fc8b-2181-441b-b4b3-efa345cfde65","Type":"ContainerStarted","Data":"5cef0cf0c7d805c4b2469f970faf7b8369fde0b53aa86e3f4b7af272ae8ff265"} Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.286294 4760 generic.go:334] "Generic (PLEG): container finished" podID="dbe589a8-e8b2-4c34-913d-cc529e207984" containerID="e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7" exitCode=0 Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.286400 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" event={"ID":"dbe589a8-e8b2-4c34-913d-cc529e207984","Type":"ContainerDied","Data":"e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7"} Dec 04 12:34:32 crc kubenswrapper[4760]: E1204 12:34:32.456515 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="b055fc8b-2181-441b-b4b3-efa345cfde65" Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.536896 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5t7bh"] Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.548404 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5t7bh"] Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.689565 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hjm6n"] Dec 04 12:34:32 crc kubenswrapper[4760]: I1204 12:34:32.698835 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hjm6n"] Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.300061 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" event={"ID":"dbe589a8-e8b2-4c34-913d-cc529e207984","Type":"ContainerStarted","Data":"011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a"} Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.300309 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.303833 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"40657d96-3f5d-4da0-9783-845c41bfeaae","Type":"ContainerStarted","Data":"41915af65209f6ce8422f1ad0f874c96314928acfb136d21eeda94f1019c834b"} Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.304194 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.306848 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"b6477a59-2dc3-4fff-907e-7e927cf257d3","Type":"ContainerStarted","Data":"59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc"} Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.310120 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62ceba36-f8bc-4644-978c-08a4cbf88ae5","Type":"ContainerStarted","Data":"c643d91bc9dc7bde5a3c757a5b02bc7fd000256f842eb4fe2df55b32bc742ec2"} Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.313350 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" event={"ID":"cf889db1-7b40-45a5-b19b-f0992b77b406","Type":"ContainerStarted","Data":"da4fcd0b879fc73bf61cb71521a1f4880a6ee6073ff12ac2f7130c072326a55e"} Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.313637 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.316086 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff","Type":"ContainerStarted","Data":"ae8d77e24ddfcdca9da6eca05da5e054083dbd4e5986f8ab9486379d59d9aecd"} Dec 04 12:34:33 crc kubenswrapper[4760]: E1204 12:34:33.320471 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="591b6997-8c15-499c-8218-e222a178559e" Dec 04 12:34:33 crc kubenswrapper[4760]: E1204 12:34:33.320471 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" 
pod="openstack/ovsdbserver-sb-0" podUID="b055fc8b-2181-441b-b4b3-efa345cfde65" Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.341061 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" podStartSLOduration=19.675065398 podStartE2EDuration="43.341009759s" podCreationTimestamp="2025-12-04 12:33:50 +0000 UTC" firstStartedPulling="2025-12-04 12:34:07.254935237 +0000 UTC m=+1250.296381804" lastFinishedPulling="2025-12-04 12:34:30.920879598 +0000 UTC m=+1273.962326165" observedRunningTime="2025-12-04 12:34:33.326811749 +0000 UTC m=+1276.368258326" watchObservedRunningTime="2025-12-04 12:34:33.341009759 +0000 UTC m=+1276.382456326" Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.479238 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=4.788004033 podStartE2EDuration="53.479188096s" podCreationTimestamp="2025-12-04 12:33:40 +0000 UTC" firstStartedPulling="2025-12-04 12:33:43.767428157 +0000 UTC m=+1226.808874724" lastFinishedPulling="2025-12-04 12:34:32.45861222 +0000 UTC m=+1275.500058787" observedRunningTime="2025-12-04 12:34:33.478608197 +0000 UTC m=+1276.520054784" watchObservedRunningTime="2025-12-04 12:34:33.479188096 +0000 UTC m=+1276.520634663" Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.530825 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" podStartSLOduration=-9223371979.323973 podStartE2EDuration="57.530802275s" podCreationTimestamp="2025-12-04 12:33:36 +0000 UTC" firstStartedPulling="2025-12-04 12:33:37.408020512 +0000 UTC m=+1220.449467089" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:34:33.529434661 +0000 UTC m=+1276.570881228" watchObservedRunningTime="2025-12-04 12:34:33.530802275 +0000 UTC m=+1276.572248852" Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.877964 4760 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e48143-4c17-45c3-ba03-f9d4a1d054be" path="/var/lib/kubelet/pods/34e48143-4c17-45c3-ba03-f9d4a1d054be/volumes" Dec 04 12:34:33 crc kubenswrapper[4760]: I1204 12:34:33.878587 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f9835b-e77b-4032-9238-b41836e0c480" path="/var/lib/kubelet/pods/99f9835b-e77b-4032-9238-b41836e0c480/volumes" Dec 04 12:34:35 crc kubenswrapper[4760]: I1204 12:34:35.336505 4760 generic.go:334] "Generic (PLEG): container finished" podID="7198d8b5-1a9e-45e7-8151-922d62c1e1f0" containerID="5fd5a34cd6a5cae785f53800e865cd97fff60370a9f523966ca07288ee961eb3" exitCode=0 Dec 04 12:34:35 crc kubenswrapper[4760]: I1204 12:34:35.336583 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-76hxp" event={"ID":"7198d8b5-1a9e-45e7-8151-922d62c1e1f0","Type":"ContainerDied","Data":"5fd5a34cd6a5cae785f53800e865cd97fff60370a9f523966ca07288ee961eb3"} Dec 04 12:34:35 crc kubenswrapper[4760]: I1204 12:34:35.341823 4760 generic.go:334] "Generic (PLEG): container finished" podID="9649c38c-ebc4-4103-aa55-c2aa867d6e26" containerID="4a6f18f194f9cfd98aa620d45e3a452eb5c24ee29372a7ada3a26caade00c544" exitCode=0 Dec 04 12:34:35 crc kubenswrapper[4760]: I1204 12:34:35.341899 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9649c38c-ebc4-4103-aa55-c2aa867d6e26","Type":"ContainerDied","Data":"4a6f18f194f9cfd98aa620d45e3a452eb5c24ee29372a7ada3a26caade00c544"} Dec 04 12:34:36 crc kubenswrapper[4760]: I1204 12:34:36.357691 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-76hxp" event={"ID":"7198d8b5-1a9e-45e7-8151-922d62c1e1f0","Type":"ContainerStarted","Data":"ed27f8126d08605423355ae5c263aebfc49dccdf081f2dfdb4025b4c43edde1b"} Dec 04 12:34:36 crc kubenswrapper[4760]: I1204 12:34:36.358654 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-76hxp" event={"ID":"7198d8b5-1a9e-45e7-8151-922d62c1e1f0","Type":"ContainerStarted","Data":"10e7849651487a74691dd4f06e738a0e66b60a9be970153c73fc13387fc69b5a"} Dec 04 12:34:36 crc kubenswrapper[4760]: I1204 12:34:36.359454 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:34:36 crc kubenswrapper[4760]: I1204 12:34:36.359486 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:34:36 crc kubenswrapper[4760]: I1204 12:34:36.362575 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9649c38c-ebc4-4103-aa55-c2aa867d6e26","Type":"ContainerStarted","Data":"306589b7246bf59fffac412d35b9f5aec549045969806099b61f22d7213cda52"} Dec 04 12:34:36 crc kubenswrapper[4760]: I1204 12:34:36.399397 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-76hxp" podStartSLOduration=5.54108683 podStartE2EDuration="50.399362221s" podCreationTimestamp="2025-12-04 12:33:46 +0000 UTC" firstStartedPulling="2025-12-04 12:33:49.435783081 +0000 UTC m=+1232.477229638" lastFinishedPulling="2025-12-04 12:34:34.294058462 +0000 UTC m=+1277.335505029" observedRunningTime="2025-12-04 12:34:36.390945164 +0000 UTC m=+1279.432391741" watchObservedRunningTime="2025-12-04 12:34:36.399362221 +0000 UTC m=+1279.440808788" Dec 04 12:34:36 crc kubenswrapper[4760]: I1204 12:34:36.419150 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.084867304 podStartE2EDuration="59.419117168s" podCreationTimestamp="2025-12-04 12:33:37 +0000 UTC" firstStartedPulling="2025-12-04 12:33:40.500027255 +0000 UTC m=+1223.541473822" lastFinishedPulling="2025-12-04 12:34:30.834277119 +0000 UTC m=+1273.875723686" observedRunningTime="2025-12-04 12:34:36.416428243 +0000 UTC m=+1279.457874830" 
watchObservedRunningTime="2025-12-04 12:34:36.419117168 +0000 UTC m=+1279.460563735" Dec 04 12:34:37 crc kubenswrapper[4760]: I1204 12:34:37.079136 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 04 12:34:37 crc kubenswrapper[4760]: I1204 12:34:37.395233 4760 generic.go:334] "Generic (PLEG): container finished" podID="db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff" containerID="ae8d77e24ddfcdca9da6eca05da5e054083dbd4e5986f8ab9486379d59d9aecd" exitCode=0 Dec 04 12:34:37 crc kubenswrapper[4760]: I1204 12:34:37.395366 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff","Type":"ContainerDied","Data":"ae8d77e24ddfcdca9da6eca05da5e054083dbd4e5986f8ab9486379d59d9aecd"} Dec 04 12:34:38 crc kubenswrapper[4760]: I1204 12:34:38.406637 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff","Type":"ContainerStarted","Data":"67a7ceb0b1bc1cfb2c2636d3a85946fdd14f37f96ce78987314933fdc82dc0ca"} Dec 04 12:34:38 crc kubenswrapper[4760]: I1204 12:34:38.455290 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371976.399519 podStartE2EDuration="1m0.455256882s" podCreationTimestamp="2025-12-04 12:33:38 +0000 UTC" firstStartedPulling="2025-12-04 12:33:43.239062141 +0000 UTC m=+1226.280508708" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:34:38.443030794 +0000 UTC m=+1281.484477371" watchObservedRunningTime="2025-12-04 12:34:38.455256882 +0000 UTC m=+1281.496703459" Dec 04 12:34:39 crc kubenswrapper[4760]: I1204 12:34:39.461965 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 04 12:34:39 crc kubenswrapper[4760]: I1204 12:34:39.462297 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 04 12:34:40 crc kubenswrapper[4760]: I1204 12:34:40.693138 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 04 12:34:40 crc kubenswrapper[4760]: I1204 12:34:40.693734 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.216584 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.295741 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zpq26"] Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.295997 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" podUID="cf889db1-7b40-45a5-b19b-f0992b77b406" containerName="dnsmasq-dns" containerID="cri-o://da4fcd0b879fc73bf61cb71521a1f4880a6ee6073ff12ac2f7130c072326a55e" gracePeriod=10 Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.304256 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.456393 4760 generic.go:334] "Generic (PLEG): container finished" podID="cf889db1-7b40-45a5-b19b-f0992b77b406" containerID="da4fcd0b879fc73bf61cb71521a1f4880a6ee6073ff12ac2f7130c072326a55e" exitCode=0 Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.456485 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" event={"ID":"cf889db1-7b40-45a5-b19b-f0992b77b406","Type":"ContainerDied","Data":"da4fcd0b879fc73bf61cb71521a1f4880a6ee6073ff12ac2f7130c072326a55e"} Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.794961 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.888782 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86xf2\" (UniqueName: \"kubernetes.io/projected/cf889db1-7b40-45a5-b19b-f0992b77b406-kube-api-access-86xf2\") pod \"cf889db1-7b40-45a5-b19b-f0992b77b406\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.888970 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-dns-svc\") pod \"cf889db1-7b40-45a5-b19b-f0992b77b406\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.889033 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-config\") pod \"cf889db1-7b40-45a5-b19b-f0992b77b406\" (UID: \"cf889db1-7b40-45a5-b19b-f0992b77b406\") " Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.896867 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf889db1-7b40-45a5-b19b-f0992b77b406-kube-api-access-86xf2" (OuterVolumeSpecName: "kube-api-access-86xf2") pod "cf889db1-7b40-45a5-b19b-f0992b77b406" (UID: "cf889db1-7b40-45a5-b19b-f0992b77b406"). InnerVolumeSpecName "kube-api-access-86xf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.933289 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-config" (OuterVolumeSpecName: "config") pod "cf889db1-7b40-45a5-b19b-f0992b77b406" (UID: "cf889db1-7b40-45a5-b19b-f0992b77b406"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:41 crc kubenswrapper[4760]: I1204 12:34:41.934874 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf889db1-7b40-45a5-b19b-f0992b77b406" (UID: "cf889db1-7b40-45a5-b19b-f0992b77b406"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.305327 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86xf2\" (UniqueName: \"kubernetes.io/projected/cf889db1-7b40-45a5-b19b-f0992b77b406-kube-api-access-86xf2\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.305726 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.305741 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf889db1-7b40-45a5-b19b-f0992b77b406-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.339703 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" event={"ID":"cf889db1-7b40-45a5-b19b-f0992b77b406","Type":"ContainerDied","Data":"11547ad189443acb2bb63d01ca97d91608e2c0f53574533c5ece517a6cef455d"} Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.339860 4760 scope.go:117] "RemoveContainer" containerID="da4fcd0b879fc73bf61cb71521a1f4880a6ee6073ff12ac2f7130c072326a55e" Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.340255 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.386643 4760 scope.go:117] "RemoveContainer" containerID="ac4c2b698f67896fc28744d6612c6dd89e34346b612eda053795cf9dde190acc" Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.394359 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zpq26"] Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.403176 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-zpq26"] Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.876947 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf889db1-7b40-45a5-b19b-f0992b77b406" path="/var/lib/kubelet/pods/cf889db1-7b40-45a5-b19b-f0992b77b406/volumes" Dec 04 12:34:43 crc kubenswrapper[4760]: I1204 12:34:43.931879 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 04 12:34:44 crc kubenswrapper[4760]: I1204 12:34:44.077296 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 04 12:34:44 crc kubenswrapper[4760]: I1204 12:34:44.800023 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 04 12:34:44 crc kubenswrapper[4760]: I1204 12:34:44.930854 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.372051 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d389a85e-f3e6-4ad5-b11d-9555d6aee3a8","Type":"ContainerStarted","Data":"61f3caafe7376c0e45c33ece0a9e92fd8c81d842dd36a3ecc67f258cd02a7e7a"} Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.373003 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/kube-state-metrics-0" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.374947 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-frcmm" event={"ID":"52334746-99b6-4056-a7d7-6df95b72d8de","Type":"ContainerStarted","Data":"9ed9550ad59cdb37ca647b023a992e34685dbd593a8bf78cb7fed6ac790f7795"} Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.375257 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-frcmm" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.378441 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xwlm9" event={"ID":"a4174c36-258e-4ed9-b6a7-f52818d3faed","Type":"ContainerStarted","Data":"3727cf76194b942932c934a61c437c0f77636d823f07aa09f39aa38ba2302498"} Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.398501 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.916820163 podStartE2EDuration="1m0.39846929s" podCreationTimestamp="2025-12-04 12:33:45 +0000 UTC" firstStartedPulling="2025-12-04 12:33:47.801717457 +0000 UTC m=+1230.843164024" lastFinishedPulling="2025-12-04 12:34:44.283366594 +0000 UTC m=+1287.324813151" observedRunningTime="2025-12-04 12:34:45.39718818 +0000 UTC m=+1288.438634757" watchObservedRunningTime="2025-12-04 12:34:45.39846929 +0000 UTC m=+1288.439915867" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.426811 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-frcmm" podStartSLOduration=4.6114067389999995 podStartE2EDuration="59.426782239s" podCreationTimestamp="2025-12-04 12:33:46 +0000 UTC" firstStartedPulling="2025-12-04 12:33:49.4703787 +0000 UTC m=+1232.511825267" lastFinishedPulling="2025-12-04 12:34:44.2857542 +0000 UTC m=+1287.327200767" observedRunningTime="2025-12-04 12:34:45.42115188 +0000 UTC m=+1288.462598447" 
watchObservedRunningTime="2025-12-04 12:34:45.426782239 +0000 UTC m=+1288.468228806" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.447606 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xwlm9" podStartSLOduration=-9223371981.407206 podStartE2EDuration="55.447570329s" podCreationTimestamp="2025-12-04 12:33:50 +0000 UTC" firstStartedPulling="2025-12-04 12:33:54.664844809 +0000 UTC m=+1237.706291376" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:34:45.442967524 +0000 UTC m=+1288.484414091" watchObservedRunningTime="2025-12-04 12:34:45.447570329 +0000 UTC m=+1288.489016896" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.929628 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-bkqwt"] Dec 04 12:34:45 crc kubenswrapper[4760]: E1204 12:34:45.930118 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf889db1-7b40-45a5-b19b-f0992b77b406" containerName="init" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.930140 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf889db1-7b40-45a5-b19b-f0992b77b406" containerName="init" Dec 04 12:34:45 crc kubenswrapper[4760]: E1204 12:34:45.930154 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf889db1-7b40-45a5-b19b-f0992b77b406" containerName="dnsmasq-dns" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.930162 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf889db1-7b40-45a5-b19b-f0992b77b406" containerName="dnsmasq-dns" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.930451 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf889db1-7b40-45a5-b19b-f0992b77b406" containerName="dnsmasq-dns" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.937522 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:45 crc kubenswrapper[4760]: I1204 12:34:45.965629 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-bkqwt"] Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.066763 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.066851 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s8vd\" (UniqueName: \"kubernetes.io/projected/435e7ea6-84c8-47b2-bff7-af011b8c7b43-kube-api-access-5s8vd\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.067179 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.068416 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-config\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.171700 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-74f6f696b9-bkqwt"] Dec 04 12:34:46 crc kubenswrapper[4760]: E1204 12:34:46.172755 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-5s8vd ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" podUID="435e7ea6-84c8-47b2-bff7-af011b8c7b43" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.173438 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.173592 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s8vd\" (UniqueName: \"kubernetes.io/projected/435e7ea6-84c8-47b2-bff7-af011b8c7b43-kube-api-access-5s8vd\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.173760 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.174002 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-config\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 
12:34:46.175194 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-config\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.175955 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.176758 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.202561 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s8vd\" (UniqueName: \"kubernetes.io/projected/435e7ea6-84c8-47b2-bff7-af011b8c7b43-kube-api-access-5s8vd\") pod \"dnsmasq-dns-74f6f696b9-bkqwt\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.244545 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-f9wkq"] Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.246410 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.255557 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.269422 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-f9wkq"] Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.379129 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-config\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.379280 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.379322 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-dns-svc\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.379362 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" 
Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.379385 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxwsg\" (UniqueName: \"kubernetes.io/projected/f65c3674-23c1-453d-8439-079822a3eb3c-kube-api-access-bxwsg\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.389553 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.456846 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.480844 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-config\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.480976 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.481009 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-dns-svc\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.481045 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.481068 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwsg\" (UniqueName: \"kubernetes.io/projected/f65c3674-23c1-453d-8439-079822a3eb3c-kube-api-access-bxwsg\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.482332 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-dns-svc\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.482330 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-config\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.482571 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.482735 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.509703 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwsg\" (UniqueName: \"kubernetes.io/projected/f65c3674-23c1-453d-8439-079822a3eb3c-kube-api-access-bxwsg\") pod \"dnsmasq-dns-698758b865-f9wkq\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.585589 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-dns-svc\") pod \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.585785 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-ovsdbserver-nb\") pod \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.585986 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-config\") pod \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\" (UID: \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.586027 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s8vd\" (UniqueName: \"kubernetes.io/projected/435e7ea6-84c8-47b2-bff7-af011b8c7b43-kube-api-access-5s8vd\") pod \"435e7ea6-84c8-47b2-bff7-af011b8c7b43\" (UID: 
\"435e7ea6-84c8-47b2-bff7-af011b8c7b43\") " Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.587083 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "435e7ea6-84c8-47b2-bff7-af011b8c7b43" (UID: "435e7ea6-84c8-47b2-bff7-af011b8c7b43"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.587237 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-config" (OuterVolumeSpecName: "config") pod "435e7ea6-84c8-47b2-bff7-af011b8c7b43" (UID: "435e7ea6-84c8-47b2-bff7-af011b8c7b43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.587270 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.587457 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "435e7ea6-84c8-47b2-bff7-af011b8c7b43" (UID: "435e7ea6-84c8-47b2-bff7-af011b8c7b43"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.590789 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-zpq26" podUID="cf889db1-7b40-45a5-b19b-f0992b77b406" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: i/o timeout" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.592362 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435e7ea6-84c8-47b2-bff7-af011b8c7b43-kube-api-access-5s8vd" (OuterVolumeSpecName: "kube-api-access-5s8vd") pod "435e7ea6-84c8-47b2-bff7-af011b8c7b43" (UID: "435e7ea6-84c8-47b2-bff7-af011b8c7b43"). InnerVolumeSpecName "kube-api-access-5s8vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.688528 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.689015 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.689031 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s8vd\" (UniqueName: \"kubernetes.io/projected/435e7ea6-84c8-47b2-bff7-af011b8c7b43-kube-api-access-5s8vd\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.689046 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/435e7ea6-84c8-47b2-bff7-af011b8c7b43-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:46 crc kubenswrapper[4760]: I1204 12:34:46.903989 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-698758b865-f9wkq"] Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.010138 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.035734 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.036435 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.040730 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.043499 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.044390 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-4fmr8" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.044648 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.103109 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ad805440-7d12-4b3e-b11b-c37463e95bb7-cache\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.103240 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ad805440-7d12-4b3e-b11b-c37463e95bb7-lock\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.103375 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5pgq\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-kube-api-access-b5pgq\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.103417 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.103444 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.195139 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-m4x4d"] Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.197540 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.204393 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.204643 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.209376 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.209429 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.209557 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ad805440-7d12-4b3e-b11b-c37463e95bb7-cache\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.209627 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ad805440-7d12-4b3e-b11b-c37463e95bb7-lock\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.209729 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5pgq\" (UniqueName: 
\"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-kube-api-access-b5pgq\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: E1204 12:34:47.210386 4760 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 12:34:47 crc kubenswrapper[4760]: E1204 12:34:47.210409 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 12:34:47 crc kubenswrapper[4760]: E1204 12:34:47.210468 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift podName:ad805440-7d12-4b3e-b11b-c37463e95bb7 nodeName:}" failed. No retries permitted until 2025-12-04 12:34:47.710444088 +0000 UTC m=+1290.751890655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift") pod "swift-storage-0" (UID: "ad805440-7d12-4b3e-b11b-c37463e95bb7") : configmap "swift-ring-files" not found Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.210908 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.212291 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ad805440-7d12-4b3e-b11b-c37463e95bb7-lock\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.212659 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ad805440-7d12-4b3e-b11b-c37463e95bb7-cache\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.219273 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.258417 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5pgq\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-kube-api-access-b5pgq\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.268756 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.312006 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-ring-data-devices\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.312617 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpcsg\" (UniqueName: \"kubernetes.io/projected/4ce3174d-015c-4a85-b58d-af7603479902-kube-api-access-qpcsg\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc 
kubenswrapper[4760]: I1204 12:34:47.312716 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ce3174d-015c-4a85-b58d-af7603479902-etc-swift\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.312898 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-dispersionconf\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.313020 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-combined-ca-bundle\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.313144 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-scripts\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.313302 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-swiftconf\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc 
kubenswrapper[4760]: I1204 12:34:47.336645 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m4x4d"] Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.416104 4760 generic.go:334] "Generic (PLEG): container finished" podID="f65c3674-23c1-453d-8439-079822a3eb3c" containerID="417e2bf2d1e0adc914a049497ab436b90e940a51b5a4ab6b6cea7f6b9a516e6d" exitCode=0 Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.416551 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-swiftconf\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.416619 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-ring-data-devices\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.416661 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpcsg\" (UniqueName: \"kubernetes.io/projected/4ce3174d-015c-4a85-b58d-af7603479902-kube-api-access-qpcsg\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.416692 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ce3174d-015c-4a85-b58d-af7603479902-etc-swift\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.416811 
4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-dispersionconf\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.416875 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-combined-ca-bundle\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.416925 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-scripts\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.416882 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-f9wkq" event={"ID":"f65c3674-23c1-453d-8439-079822a3eb3c","Type":"ContainerDied","Data":"417e2bf2d1e0adc914a049497ab436b90e940a51b5a4ab6b6cea7f6b9a516e6d"} Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.417069 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-f9wkq" event={"ID":"f65c3674-23c1-453d-8439-079822a3eb3c","Type":"ContainerStarted","Data":"df49979df03411a4afc00f507c9afa7b31708ce8198484f890a2b28276bc87cf"} Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.418009 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-scripts\") pod \"swift-ring-rebalance-m4x4d\" (UID: 
\"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.418408 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-ring-data-devices\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.419017 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ce3174d-015c-4a85-b58d-af7603479902-etc-swift\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.427800 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-dispersionconf\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.433745 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-bkqwt" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.434176 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b055fc8b-2181-441b-b4b3-efa345cfde65","Type":"ContainerStarted","Data":"f80f9dff160605c5480faff3a8103a9d7c1471ad18f8e702f7aa72f03049d22d"} Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.456857 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-swiftconf\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.461513 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-combined-ca-bundle\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.479257 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpcsg\" (UniqueName: \"kubernetes.io/projected/4ce3174d-015c-4a85-b58d-af7603479902-kube-api-access-qpcsg\") pod \"swift-ring-rebalance-m4x4d\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.501336 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=17.480360665 podStartE2EDuration="56.501198487s" podCreationTimestamp="2025-12-04 12:33:51 +0000 UTC" firstStartedPulling="2025-12-04 12:34:07.253690628 +0000 UTC m=+1250.295137195" lastFinishedPulling="2025-12-04 12:34:46.27452846 +0000 UTC m=+1289.315975017" observedRunningTime="2025-12-04 
12:34:47.499250136 +0000 UTC m=+1290.540696713" watchObservedRunningTime="2025-12-04 12:34:47.501198487 +0000 UTC m=+1290.542645044" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.590387 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-bkqwt"] Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.600317 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.600764 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-bkqwt"] Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.708011 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.724242 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:47 crc kubenswrapper[4760]: E1204 12:34:47.724564 4760 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 12:34:47 crc kubenswrapper[4760]: E1204 12:34:47.724610 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 12:34:47 crc kubenswrapper[4760]: E1204 12:34:47.724743 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift podName:ad805440-7d12-4b3e-b11b-c37463e95bb7 nodeName:}" failed. No retries permitted until 2025-12-04 12:34:48.724686562 +0000 UTC m=+1291.766133129 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift") pod "swift-storage-0" (UID: "ad805440-7d12-4b3e-b11b-c37463e95bb7") : configmap "swift-ring-files" not found Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.862602 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-qnxc5"] Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.864448 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qnxc5" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.908082 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435e7ea6-84c8-47b2-bff7-af011b8c7b43" path="/var/lib/kubelet/pods/435e7ea6-84c8-47b2-bff7-af011b8c7b43/volumes" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.916263 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-069b-account-create-update-d2dm2"] Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.917599 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qnxc5"] Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.917717 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-069b-account-create-update-d2dm2" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.921462 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 04 12:34:47 crc kubenswrapper[4760]: I1204 12:34:47.941569 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-069b-account-create-update-d2dm2"] Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.046428 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a470b427-0c54-4814-afcc-189f86207a0c-operator-scripts\") pod \"glance-db-create-qnxc5\" (UID: \"a470b427-0c54-4814-afcc-189f86207a0c\") " pod="openstack/glance-db-create-qnxc5" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.046548 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55a1d86-936b-439f-aef3-a2e55a596cd6-operator-scripts\") pod \"glance-069b-account-create-update-d2dm2\" (UID: \"c55a1d86-936b-439f-aef3-a2e55a596cd6\") " pod="openstack/glance-069b-account-create-update-d2dm2" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.046594 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8whvj\" (UniqueName: \"kubernetes.io/projected/a470b427-0c54-4814-afcc-189f86207a0c-kube-api-access-8whvj\") pod \"glance-db-create-qnxc5\" (UID: \"a470b427-0c54-4814-afcc-189f86207a0c\") " pod="openstack/glance-db-create-qnxc5" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.046635 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44x7v\" (UniqueName: \"kubernetes.io/projected/c55a1d86-936b-439f-aef3-a2e55a596cd6-kube-api-access-44x7v\") pod \"glance-069b-account-create-update-d2dm2\" 
(UID: \"c55a1d86-936b-439f-aef3-a2e55a596cd6\") " pod="openstack/glance-069b-account-create-update-d2dm2" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.148587 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a470b427-0c54-4814-afcc-189f86207a0c-operator-scripts\") pod \"glance-db-create-qnxc5\" (UID: \"a470b427-0c54-4814-afcc-189f86207a0c\") " pod="openstack/glance-db-create-qnxc5" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.149113 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55a1d86-936b-439f-aef3-a2e55a596cd6-operator-scripts\") pod \"glance-069b-account-create-update-d2dm2\" (UID: \"c55a1d86-936b-439f-aef3-a2e55a596cd6\") " pod="openstack/glance-069b-account-create-update-d2dm2" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.149299 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8whvj\" (UniqueName: \"kubernetes.io/projected/a470b427-0c54-4814-afcc-189f86207a0c-kube-api-access-8whvj\") pod \"glance-db-create-qnxc5\" (UID: \"a470b427-0c54-4814-afcc-189f86207a0c\") " pod="openstack/glance-db-create-qnxc5" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.149500 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44x7v\" (UniqueName: \"kubernetes.io/projected/c55a1d86-936b-439f-aef3-a2e55a596cd6-kube-api-access-44x7v\") pod \"glance-069b-account-create-update-d2dm2\" (UID: \"c55a1d86-936b-439f-aef3-a2e55a596cd6\") " pod="openstack/glance-069b-account-create-update-d2dm2" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.151624 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55a1d86-936b-439f-aef3-a2e55a596cd6-operator-scripts\") pod 
\"glance-069b-account-create-update-d2dm2\" (UID: \"c55a1d86-936b-439f-aef3-a2e55a596cd6\") " pod="openstack/glance-069b-account-create-update-d2dm2" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.152472 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a470b427-0c54-4814-afcc-189f86207a0c-operator-scripts\") pod \"glance-db-create-qnxc5\" (UID: \"a470b427-0c54-4814-afcc-189f86207a0c\") " pod="openstack/glance-db-create-qnxc5" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.172078 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44x7v\" (UniqueName: \"kubernetes.io/projected/c55a1d86-936b-439f-aef3-a2e55a596cd6-kube-api-access-44x7v\") pod \"glance-069b-account-create-update-d2dm2\" (UID: \"c55a1d86-936b-439f-aef3-a2e55a596cd6\") " pod="openstack/glance-069b-account-create-update-d2dm2" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.172610 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8whvj\" (UniqueName: \"kubernetes.io/projected/a470b427-0c54-4814-afcc-189f86207a0c-kube-api-access-8whvj\") pod \"glance-db-create-qnxc5\" (UID: \"a470b427-0c54-4814-afcc-189f86207a0c\") " pod="openstack/glance-db-create-qnxc5" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.206900 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qnxc5" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.253244 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m4x4d"] Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.255824 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-069b-account-create-update-d2dm2" Dec 04 12:34:48 crc kubenswrapper[4760]: W1204 12:34:48.266417 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce3174d_015c_4a85_b58d_af7603479902.slice/crio-5e229010f20a39d940f3306b40a342126b62bc739335d7613100ac88e2d89939 WatchSource:0}: Error finding container 5e229010f20a39d940f3306b40a342126b62bc739335d7613100ac88e2d89939: Status 404 returned error can't find the container with id 5e229010f20a39d940f3306b40a342126b62bc739335d7613100ac88e2d89939 Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.456808 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m4x4d" event={"ID":"4ce3174d-015c-4a85-b58d-af7603479902","Type":"ContainerStarted","Data":"5e229010f20a39d940f3306b40a342126b62bc739335d7613100ac88e2d89939"} Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.463728 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-f9wkq" event={"ID":"f65c3674-23c1-453d-8439-079822a3eb3c","Type":"ContainerStarted","Data":"818e5deaa954e765ce9d8c324af281c2674f1d35350b23f6c19209721e99294e"} Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.463957 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.516395 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-f9wkq" podStartSLOduration=2.516364012 podStartE2EDuration="2.516364012s" podCreationTimestamp="2025-12-04 12:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:34:48.514649737 +0000 UTC m=+1291.556096304" watchObservedRunningTime="2025-12-04 12:34:48.516364012 +0000 UTC m=+1291.557810579" 
Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.600337 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.653046 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-069b-account-create-update-d2dm2"] Dec 04 12:34:48 crc kubenswrapper[4760]: W1204 12:34:48.657548 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc55a1d86_936b_439f_aef3_a2e55a596cd6.slice/crio-6525d063a89583399510999dd5e151f12a822a314392f0548404261c5382a6ba WatchSource:0}: Error finding container 6525d063a89583399510999dd5e151f12a822a314392f0548404261c5382a6ba: Status 404 returned error can't find the container with id 6525d063a89583399510999dd5e151f12a822a314392f0548404261c5382a6ba Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.744928 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qnxc5"] Dec 04 12:34:48 crc kubenswrapper[4760]: I1204 12:34:48.768187 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:48 crc kubenswrapper[4760]: E1204 12:34:48.768629 4760 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 12:34:48 crc kubenswrapper[4760]: E1204 12:34:48.768651 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 12:34:48 crc kubenswrapper[4760]: E1204 12:34:48.768724 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift 
podName:ad805440-7d12-4b3e-b11b-c37463e95bb7 nodeName:}" failed. No retries permitted until 2025-12-04 12:34:50.768701552 +0000 UTC m=+1293.810148119 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift") pod "swift-storage-0" (UID: "ad805440-7d12-4b3e-b11b-c37463e95bb7") : configmap "swift-ring-files" not found Dec 04 12:34:49 crc kubenswrapper[4760]: I1204 12:34:49.486458 4760 generic.go:334] "Generic (PLEG): container finished" podID="c55a1d86-936b-439f-aef3-a2e55a596cd6" containerID="3ced422b3b35dcf011f420f9462005fbd18b90cd9806383302c006679292daee" exitCode=0 Dec 04 12:34:49 crc kubenswrapper[4760]: I1204 12:34:49.487099 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-069b-account-create-update-d2dm2" event={"ID":"c55a1d86-936b-439f-aef3-a2e55a596cd6","Type":"ContainerDied","Data":"3ced422b3b35dcf011f420f9462005fbd18b90cd9806383302c006679292daee"} Dec 04 12:34:49 crc kubenswrapper[4760]: I1204 12:34:49.487145 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-069b-account-create-update-d2dm2" event={"ID":"c55a1d86-936b-439f-aef3-a2e55a596cd6","Type":"ContainerStarted","Data":"6525d063a89583399510999dd5e151f12a822a314392f0548404261c5382a6ba"} Dec 04 12:34:49 crc kubenswrapper[4760]: I1204 12:34:49.493505 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"591b6997-8c15-499c-8218-e222a178559e","Type":"ContainerStarted","Data":"45239ae8eeadb1953387bffa18aa4e494041ef33a1b75d40b9fdc42a520175a4"} Dec 04 12:34:49 crc kubenswrapper[4760]: I1204 12:34:49.495938 4760 generic.go:334] "Generic (PLEG): container finished" podID="a470b427-0c54-4814-afcc-189f86207a0c" containerID="29b91ba4f26a2f55c7ef89bec053bdfd9259fc178a4f2eefe549440980445bdf" exitCode=0 Dec 04 12:34:49 crc kubenswrapper[4760]: I1204 12:34:49.496068 4760 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-db-create-qnxc5" event={"ID":"a470b427-0c54-4814-afcc-189f86207a0c","Type":"ContainerDied","Data":"29b91ba4f26a2f55c7ef89bec053bdfd9259fc178a4f2eefe549440980445bdf"} Dec 04 12:34:49 crc kubenswrapper[4760]: I1204 12:34:49.497388 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qnxc5" event={"ID":"a470b427-0c54-4814-afcc-189f86207a0c","Type":"ContainerStarted","Data":"7a94b80d9252032a490d6855551515ff6730aaf467fba19cc497f2fdffa15a8f"} Dec 04 12:34:49 crc kubenswrapper[4760]: I1204 12:34:49.570305 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.916696788 podStartE2EDuration="1m4.570268766s" podCreationTimestamp="2025-12-04 12:33:45 +0000 UTC" firstStartedPulling="2025-12-04 12:33:49.692681655 +0000 UTC m=+1232.734128222" lastFinishedPulling="2025-12-04 12:34:48.346253633 +0000 UTC m=+1291.387700200" observedRunningTime="2025-12-04 12:34:49.563313916 +0000 UTC m=+1292.604760483" watchObservedRunningTime="2025-12-04 12:34:49.570268766 +0000 UTC m=+1292.611715353" Dec 04 12:34:50 crc kubenswrapper[4760]: I1204 12:34:50.666509 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 04 12:34:50 crc kubenswrapper[4760]: I1204 12:34:50.806812 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 04 12:34:50 crc kubenswrapper[4760]: I1204 12:34:50.830347 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:50 crc kubenswrapper[4760]: E1204 12:34:50.830895 4760 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 
12:34:50 crc kubenswrapper[4760]: E1204 12:34:50.830929 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 12:34:50 crc kubenswrapper[4760]: E1204 12:34:50.831009 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift podName:ad805440-7d12-4b3e-b11b-c37463e95bb7 nodeName:}" failed. No retries permitted until 2025-12-04 12:34:54.830983005 +0000 UTC m=+1297.872429582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift") pod "swift-storage-0" (UID: "ad805440-7d12-4b3e-b11b-c37463e95bb7") : configmap "swift-ring-files" not found Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.131841 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mnp4l"] Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.133509 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mnp4l" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.149749 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6133-account-create-update-gpcjt"] Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.151870 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6133-account-create-update-gpcjt" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.159443 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.163567 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mnp4l"] Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.217395 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6133-account-create-update-gpcjt"] Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.238904 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-operator-scripts\") pod \"keystone-6133-account-create-update-gpcjt\" (UID: \"2af69cbd-7106-43d8-9a09-e55a18ffa2bb\") " pod="openstack/keystone-6133-account-create-update-gpcjt" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.238997 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq6fc\" (UniqueName: \"kubernetes.io/projected/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-kube-api-access-hq6fc\") pod \"keystone-6133-account-create-update-gpcjt\" (UID: \"2af69cbd-7106-43d8-9a09-e55a18ffa2bb\") " pod="openstack/keystone-6133-account-create-update-gpcjt" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.239033 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5k4d\" (UniqueName: \"kubernetes.io/projected/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-kube-api-access-s5k4d\") pod \"keystone-db-create-mnp4l\" (UID: \"a484ab2a-7803-4ce1-8ba6-32942c29c4d9\") " pod="openstack/keystone-db-create-mnp4l" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.239135 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-operator-scripts\") pod \"keystone-db-create-mnp4l\" (UID: \"a484ab2a-7803-4ce1-8ba6-32942c29c4d9\") " pod="openstack/keystone-db-create-mnp4l" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.341225 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq6fc\" (UniqueName: \"kubernetes.io/projected/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-kube-api-access-hq6fc\") pod \"keystone-6133-account-create-update-gpcjt\" (UID: \"2af69cbd-7106-43d8-9a09-e55a18ffa2bb\") " pod="openstack/keystone-6133-account-create-update-gpcjt" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.341310 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5k4d\" (UniqueName: \"kubernetes.io/projected/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-kube-api-access-s5k4d\") pod \"keystone-db-create-mnp4l\" (UID: \"a484ab2a-7803-4ce1-8ba6-32942c29c4d9\") " pod="openstack/keystone-db-create-mnp4l" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.341410 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-operator-scripts\") pod \"keystone-db-create-mnp4l\" (UID: \"a484ab2a-7803-4ce1-8ba6-32942c29c4d9\") " pod="openstack/keystone-db-create-mnp4l" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.341492 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-operator-scripts\") pod \"keystone-6133-account-create-update-gpcjt\" (UID: \"2af69cbd-7106-43d8-9a09-e55a18ffa2bb\") " pod="openstack/keystone-6133-account-create-update-gpcjt" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 
12:34:51.342905 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-operator-scripts\") pod \"keystone-6133-account-create-update-gpcjt\" (UID: \"2af69cbd-7106-43d8-9a09-e55a18ffa2bb\") " pod="openstack/keystone-6133-account-create-update-gpcjt" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.342926 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-operator-scripts\") pod \"keystone-db-create-mnp4l\" (UID: \"a484ab2a-7803-4ce1-8ba6-32942c29c4d9\") " pod="openstack/keystone-db-create-mnp4l" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.366935 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5k4d\" (UniqueName: \"kubernetes.io/projected/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-kube-api-access-s5k4d\") pod \"keystone-db-create-mnp4l\" (UID: \"a484ab2a-7803-4ce1-8ba6-32942c29c4d9\") " pod="openstack/keystone-db-create-mnp4l" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.374807 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq6fc\" (UniqueName: \"kubernetes.io/projected/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-kube-api-access-hq6fc\") pod \"keystone-6133-account-create-update-gpcjt\" (UID: \"2af69cbd-7106-43d8-9a09-e55a18ffa2bb\") " pod="openstack/keystone-6133-account-create-update-gpcjt" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.487222 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mnp4l" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.512097 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6133-account-create-update-gpcjt" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.571952 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-wd987"] Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.573525 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wd987" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.586375 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wd987"] Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.651248 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhf9n\" (UniqueName: \"kubernetes.io/projected/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-kube-api-access-vhf9n\") pod \"placement-db-create-wd987\" (UID: \"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863\") " pod="openstack/placement-db-create-wd987" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.651444 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-operator-scripts\") pod \"placement-db-create-wd987\" (UID: \"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863\") " pod="openstack/placement-db-create-wd987" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.754132 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhf9n\" (UniqueName: \"kubernetes.io/projected/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-kube-api-access-vhf9n\") pod \"placement-db-create-wd987\" (UID: \"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863\") " pod="openstack/placement-db-create-wd987" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.754310 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-operator-scripts\") pod \"placement-db-create-wd987\" (UID: \"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863\") " pod="openstack/placement-db-create-wd987" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.755718 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-operator-scripts\") pod \"placement-db-create-wd987\" (UID: \"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863\") " pod="openstack/placement-db-create-wd987" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.774793 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhf9n\" (UniqueName: \"kubernetes.io/projected/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-kube-api-access-vhf9n\") pod \"placement-db-create-wd987\" (UID: \"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863\") " pod="openstack/placement-db-create-wd987" Dec 04 12:34:51 crc kubenswrapper[4760]: I1204 12:34:51.909950 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wd987" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.213965 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3e42-account-create-update-h6rgv"] Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.215700 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3e42-account-create-update-h6rgv" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.219889 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.230015 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3e42-account-create-update-h6rgv"] Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.236024 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qnxc5" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.252443 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-069b-account-create-update-d2dm2" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.266627 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8whvj\" (UniqueName: \"kubernetes.io/projected/a470b427-0c54-4814-afcc-189f86207a0c-kube-api-access-8whvj\") pod \"a470b427-0c54-4814-afcc-189f86207a0c\" (UID: \"a470b427-0c54-4814-afcc-189f86207a0c\") " Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.267087 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a470b427-0c54-4814-afcc-189f86207a0c-operator-scripts\") pod \"a470b427-0c54-4814-afcc-189f86207a0c\" (UID: \"a470b427-0c54-4814-afcc-189f86207a0c\") " Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.270305 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a470b427-0c54-4814-afcc-189f86207a0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a470b427-0c54-4814-afcc-189f86207a0c" (UID: "a470b427-0c54-4814-afcc-189f86207a0c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.274933 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5b26f6a-29a2-4c94-8606-0b8d925489aa-operator-scripts\") pod \"placement-3e42-account-create-update-h6rgv\" (UID: \"b5b26f6a-29a2-4c94-8606-0b8d925489aa\") " pod="openstack/placement-3e42-account-create-update-h6rgv" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.275992 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn5x6\" (UniqueName: \"kubernetes.io/projected/b5b26f6a-29a2-4c94-8606-0b8d925489aa-kube-api-access-zn5x6\") pod \"placement-3e42-account-create-update-h6rgv\" (UID: \"b5b26f6a-29a2-4c94-8606-0b8d925489aa\") " pod="openstack/placement-3e42-account-create-update-h6rgv" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.276174 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a470b427-0c54-4814-afcc-189f86207a0c-kube-api-access-8whvj" (OuterVolumeSpecName: "kube-api-access-8whvj") pod "a470b427-0c54-4814-afcc-189f86207a0c" (UID: "a470b427-0c54-4814-afcc-189f86207a0c"). InnerVolumeSpecName "kube-api-access-8whvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.277534 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a470b427-0c54-4814-afcc-189f86207a0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.277598 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8whvj\" (UniqueName: \"kubernetes.io/projected/a470b427-0c54-4814-afcc-189f86207a0c-kube-api-access-8whvj\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.379357 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55a1d86-936b-439f-aef3-a2e55a596cd6-operator-scripts\") pod \"c55a1d86-936b-439f-aef3-a2e55a596cd6\" (UID: \"c55a1d86-936b-439f-aef3-a2e55a596cd6\") " Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.379967 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44x7v\" (UniqueName: \"kubernetes.io/projected/c55a1d86-936b-439f-aef3-a2e55a596cd6-kube-api-access-44x7v\") pod \"c55a1d86-936b-439f-aef3-a2e55a596cd6\" (UID: \"c55a1d86-936b-439f-aef3-a2e55a596cd6\") " Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.380122 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c55a1d86-936b-439f-aef3-a2e55a596cd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c55a1d86-936b-439f-aef3-a2e55a596cd6" (UID: "c55a1d86-936b-439f-aef3-a2e55a596cd6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.380353 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn5x6\" (UniqueName: \"kubernetes.io/projected/b5b26f6a-29a2-4c94-8606-0b8d925489aa-kube-api-access-zn5x6\") pod \"placement-3e42-account-create-update-h6rgv\" (UID: \"b5b26f6a-29a2-4c94-8606-0b8d925489aa\") " pod="openstack/placement-3e42-account-create-update-h6rgv" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.380491 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5b26f6a-29a2-4c94-8606-0b8d925489aa-operator-scripts\") pod \"placement-3e42-account-create-update-h6rgv\" (UID: \"b5b26f6a-29a2-4c94-8606-0b8d925489aa\") " pod="openstack/placement-3e42-account-create-update-h6rgv" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.380592 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c55a1d86-936b-439f-aef3-a2e55a596cd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.383621 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5b26f6a-29a2-4c94-8606-0b8d925489aa-operator-scripts\") pod \"placement-3e42-account-create-update-h6rgv\" (UID: \"b5b26f6a-29a2-4c94-8606-0b8d925489aa\") " pod="openstack/placement-3e42-account-create-update-h6rgv" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.389465 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55a1d86-936b-439f-aef3-a2e55a596cd6-kube-api-access-44x7v" (OuterVolumeSpecName: "kube-api-access-44x7v") pod "c55a1d86-936b-439f-aef3-a2e55a596cd6" (UID: "c55a1d86-936b-439f-aef3-a2e55a596cd6"). InnerVolumeSpecName "kube-api-access-44x7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.404868 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn5x6\" (UniqueName: \"kubernetes.io/projected/b5b26f6a-29a2-4c94-8606-0b8d925489aa-kube-api-access-zn5x6\") pod \"placement-3e42-account-create-update-h6rgv\" (UID: \"b5b26f6a-29a2-4c94-8606-0b8d925489aa\") " pod="openstack/placement-3e42-account-create-update-h6rgv" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.483804 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44x7v\" (UniqueName: \"kubernetes.io/projected/c55a1d86-936b-439f-aef3-a2e55a596cd6-kube-api-access-44x7v\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.550748 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-069b-account-create-update-d2dm2" event={"ID":"c55a1d86-936b-439f-aef3-a2e55a596cd6","Type":"ContainerDied","Data":"6525d063a89583399510999dd5e151f12a822a314392f0548404261c5382a6ba"} Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.550801 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6525d063a89583399510999dd5e151f12a822a314392f0548404261c5382a6ba" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.551027 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-069b-account-create-update-d2dm2" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.561744 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3e42-account-create-update-h6rgv" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.563120 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qnxc5" event={"ID":"a470b427-0c54-4814-afcc-189f86207a0c","Type":"ContainerDied","Data":"7a94b80d9252032a490d6855551515ff6730aaf467fba19cc497f2fdffa15a8f"} Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.563166 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a94b80d9252032a490d6855551515ff6730aaf467fba19cc497f2fdffa15a8f" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.563252 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qnxc5" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.795747 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6133-account-create-update-gpcjt"] Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.805727 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.896265 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wd987"] Dec 04 12:34:52 crc kubenswrapper[4760]: I1204 12:34:52.989374 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mnp4l"] Dec 04 12:34:52 crc kubenswrapper[4760]: W1204 12:34:52.999588 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda484ab2a_7803_4ce1_8ba6_32942c29c4d9.slice/crio-235ce3781b86088df007711a074c40a43c66bf33c1e2b8c7a535431a6d20a360 WatchSource:0}: Error finding container 235ce3781b86088df007711a074c40a43c66bf33c1e2b8c7a535431a6d20a360: Status 404 returned error can't find the container with id 235ce3781b86088df007711a074c40a43c66bf33c1e2b8c7a535431a6d20a360 Dec 
04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.188337 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3e42-account-create-update-h6rgv"] Dec 04 12:34:53 crc kubenswrapper[4760]: W1204 12:34:53.193654 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5b26f6a_29a2_4c94_8606_0b8d925489aa.slice/crio-009605c3a6670802929a6ebd436042f0ace75fbcb9a2fad17ab5ff6eaff5a99d WatchSource:0}: Error finding container 009605c3a6670802929a6ebd436042f0ace75fbcb9a2fad17ab5ff6eaff5a99d: Status 404 returned error can't find the container with id 009605c3a6670802929a6ebd436042f0ace75fbcb9a2fad17ab5ff6eaff5a99d Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.582714 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m4x4d" event={"ID":"4ce3174d-015c-4a85-b58d-af7603479902","Type":"ContainerStarted","Data":"c60de42ac34004b15b2af639872103dddc9d22c78376f01c2f9c4dc0c121aa02"} Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.586249 4760 generic.go:334] "Generic (PLEG): container finished" podID="a484ab2a-7803-4ce1-8ba6-32942c29c4d9" containerID="4ca983cdb6d8314e316ade854fe7c3ffb4778bffa38b0a188100360d08282d81" exitCode=0 Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.586539 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mnp4l" event={"ID":"a484ab2a-7803-4ce1-8ba6-32942c29c4d9","Type":"ContainerDied","Data":"4ca983cdb6d8314e316ade854fe7c3ffb4778bffa38b0a188100360d08282d81"} Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.586640 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mnp4l" event={"ID":"a484ab2a-7803-4ce1-8ba6-32942c29c4d9","Type":"ContainerStarted","Data":"235ce3781b86088df007711a074c40a43c66bf33c1e2b8c7a535431a6d20a360"} Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.590433 4760 generic.go:334] "Generic 
(PLEG): container finished" podID="4e2e8b9f-43da-419b-8cf4-f96f3ef4c863" containerID="46fb2cf46ada2f5103daba9a6a96a0b17b74c02c4bb8c8201176ed75f1269595" exitCode=0 Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.590588 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wd987" event={"ID":"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863","Type":"ContainerDied","Data":"46fb2cf46ada2f5103daba9a6a96a0b17b74c02c4bb8c8201176ed75f1269595"} Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.590646 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wd987" event={"ID":"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863","Type":"ContainerStarted","Data":"eb06769711f6ba8369810948a656f9c802afec2365c502d0d53d34049256cebe"} Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.592569 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3e42-account-create-update-h6rgv" event={"ID":"b5b26f6a-29a2-4c94-8606-0b8d925489aa","Type":"ContainerStarted","Data":"b5cdd98a02b38ffb70b102faa8289a81b610f49b6f2876055aca2d540f66beb4"} Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.592602 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3e42-account-create-update-h6rgv" event={"ID":"b5b26f6a-29a2-4c94-8606-0b8d925489aa","Type":"ContainerStarted","Data":"009605c3a6670802929a6ebd436042f0ace75fbcb9a2fad17ab5ff6eaff5a99d"} Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.594608 4760 generic.go:334] "Generic (PLEG): container finished" podID="2af69cbd-7106-43d8-9a09-e55a18ffa2bb" containerID="b6ec6a304182e8a5c7440e1e6f632b72c538f3ec95d19031df662d96819be230" exitCode=0 Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.594640 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6133-account-create-update-gpcjt" 
event={"ID":"2af69cbd-7106-43d8-9a09-e55a18ffa2bb","Type":"ContainerDied","Data":"b6ec6a304182e8a5c7440e1e6f632b72c538f3ec95d19031df662d96819be230"} Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.594654 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6133-account-create-update-gpcjt" event={"ID":"2af69cbd-7106-43d8-9a09-e55a18ffa2bb","Type":"ContainerStarted","Data":"13fd9c115bbf47731e633f8ccd774d2a12a68fb450981d428a183d5b3afa543d"} Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.618201 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-m4x4d" podStartSLOduration=2.6133229350000002 podStartE2EDuration="6.618173231s" podCreationTimestamp="2025-12-04 12:34:47 +0000 UTC" firstStartedPulling="2025-12-04 12:34:48.278069388 +0000 UTC m=+1291.319515955" lastFinishedPulling="2025-12-04 12:34:52.282919684 +0000 UTC m=+1295.324366251" observedRunningTime="2025-12-04 12:34:53.610996574 +0000 UTC m=+1296.652443161" watchObservedRunningTime="2025-12-04 12:34:53.618173231 +0000 UTC m=+1296.659619798" Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.659092 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.723421 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-3e42-account-create-update-h6rgv" podStartSLOduration=1.7233879010000002 podStartE2EDuration="1.723387901s" podCreationTimestamp="2025-12-04 12:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:34:53.708818059 +0000 UTC m=+1296.750264626" watchObservedRunningTime="2025-12-04 12:34:53.723387901 +0000 UTC m=+1296.764834468" Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.885018 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/ovsdbserver-nb-0" Dec 04 12:34:53 crc kubenswrapper[4760]: I1204 12:34:53.938017 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.228816 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 04 12:34:54 crc kubenswrapper[4760]: E1204 12:34:54.229444 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a470b427-0c54-4814-afcc-189f86207a0c" containerName="mariadb-database-create" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.229476 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a470b427-0c54-4814-afcc-189f86207a0c" containerName="mariadb-database-create" Dec 04 12:34:54 crc kubenswrapper[4760]: E1204 12:34:54.229520 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55a1d86-936b-439f-aef3-a2e55a596cd6" containerName="mariadb-account-create-update" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.229529 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55a1d86-936b-439f-aef3-a2e55a596cd6" containerName="mariadb-account-create-update" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.229807 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a470b427-0c54-4814-afcc-189f86207a0c" containerName="mariadb-database-create" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.229903 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55a1d86-936b-439f-aef3-a2e55a596cd6" containerName="mariadb-account-create-update" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.235115 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.243554 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.243996 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.244130 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-jplzx" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.244265 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.280134 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.328792 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa2846c9-8951-497c-bcae-d186f8f62265-config\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.329027 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa2846c9-8951-497c-bcae-d186f8f62265-scripts\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.329245 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa2846c9-8951-497c-bcae-d186f8f62265-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc 
kubenswrapper[4760]: I1204 12:34:54.329571 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2846c9-8951-497c-bcae-d186f8f62265-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.329695 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2846c9-8951-497c-bcae-d186f8f62265-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.329904 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2846c9-8951-497c-bcae-d186f8f62265-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.329978 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkmq5\" (UniqueName: \"kubernetes.io/projected/fa2846c9-8951-497c-bcae-d186f8f62265-kube-api-access-wkmq5\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.431816 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2846c9-8951-497c-bcae-d186f8f62265-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.431885 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2846c9-8951-497c-bcae-d186f8f62265-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.431956 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2846c9-8951-497c-bcae-d186f8f62265-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.431984 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkmq5\" (UniqueName: \"kubernetes.io/projected/fa2846c9-8951-497c-bcae-d186f8f62265-kube-api-access-wkmq5\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.432048 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa2846c9-8951-497c-bcae-d186f8f62265-config\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.432101 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa2846c9-8951-497c-bcae-d186f8f62265-scripts\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.432130 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa2846c9-8951-497c-bcae-d186f8f62265-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.432737 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa2846c9-8951-497c-bcae-d186f8f62265-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.435604 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa2846c9-8951-497c-bcae-d186f8f62265-scripts\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.436488 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa2846c9-8951-497c-bcae-d186f8f62265-config\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.443029 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2846c9-8951-497c-bcae-d186f8f62265-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.443172 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2846c9-8951-497c-bcae-d186f8f62265-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.446413 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fa2846c9-8951-497c-bcae-d186f8f62265-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.455060 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkmq5\" (UniqueName: \"kubernetes.io/projected/fa2846c9-8951-497c-bcae-d186f8f62265-kube-api-access-wkmq5\") pod \"ovn-northd-0\" (UID: \"fa2846c9-8951-497c-bcae-d186f8f62265\") " pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.569158 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.608849 4760 generic.go:334] "Generic (PLEG): container finished" podID="b5b26f6a-29a2-4c94-8606-0b8d925489aa" containerID="b5cdd98a02b38ffb70b102faa8289a81b610f49b6f2876055aca2d540f66beb4" exitCode=0 Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.609322 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3e42-account-create-update-h6rgv" event={"ID":"b5b26f6a-29a2-4c94-8606-0b8d925489aa","Type":"ContainerDied","Data":"b5cdd98a02b38ffb70b102faa8289a81b610f49b6f2876055aca2d540f66beb4"} Dec 04 12:34:54 crc kubenswrapper[4760]: I1204 12:34:54.842349 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:34:54 crc kubenswrapper[4760]: E1204 12:34:54.842691 4760 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 12:34:54 crc kubenswrapper[4760]: E1204 12:34:54.842714 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 12:34:54 crc kubenswrapper[4760]: E1204 12:34:54.842790 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift podName:ad805440-7d12-4b3e-b11b-c37463e95bb7 nodeName:}" failed. No retries permitted until 2025-12-04 12:35:02.842762644 +0000 UTC m=+1305.884209211 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift") pod "swift-storage-0" (UID: "ad805440-7d12-4b3e-b11b-c37463e95bb7") : configmap "swift-ring-files" not found Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.084956 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wd987" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.148761 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-operator-scripts\") pod \"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863\" (UID: \"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863\") " Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.149579 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhf9n\" (UniqueName: \"kubernetes.io/projected/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-kube-api-access-vhf9n\") pod \"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863\" (UID: \"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863\") " Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.152558 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e2e8b9f-43da-419b-8cf4-f96f3ef4c863" (UID: "4e2e8b9f-43da-419b-8cf4-f96f3ef4c863"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.161108 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-kube-api-access-vhf9n" (OuterVolumeSpecName: "kube-api-access-vhf9n") pod "4e2e8b9f-43da-419b-8cf4-f96f3ef4c863" (UID: "4e2e8b9f-43da-419b-8cf4-f96f3ef4c863"). InnerVolumeSpecName "kube-api-access-vhf9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.190641 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mnp4l" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.208568 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6133-account-create-update-gpcjt" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.255468 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq6fc\" (UniqueName: \"kubernetes.io/projected/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-kube-api-access-hq6fc\") pod \"2af69cbd-7106-43d8-9a09-e55a18ffa2bb\" (UID: \"2af69cbd-7106-43d8-9a09-e55a18ffa2bb\") " Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.255723 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5k4d\" (UniqueName: \"kubernetes.io/projected/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-kube-api-access-s5k4d\") pod \"a484ab2a-7803-4ce1-8ba6-32942c29c4d9\" (UID: \"a484ab2a-7803-4ce1-8ba6-32942c29c4d9\") " Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.255769 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-operator-scripts\") pod \"2af69cbd-7106-43d8-9a09-e55a18ffa2bb\" (UID: \"2af69cbd-7106-43d8-9a09-e55a18ffa2bb\") 
" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.255891 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-operator-scripts\") pod \"a484ab2a-7803-4ce1-8ba6-32942c29c4d9\" (UID: \"a484ab2a-7803-4ce1-8ba6-32942c29c4d9\") " Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.256620 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhf9n\" (UniqueName: \"kubernetes.io/projected/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-kube-api-access-vhf9n\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.256647 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.256762 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2af69cbd-7106-43d8-9a09-e55a18ffa2bb" (UID: "2af69cbd-7106-43d8-9a09-e55a18ffa2bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.258092 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a484ab2a-7803-4ce1-8ba6-32942c29c4d9" (UID: "a484ab2a-7803-4ce1-8ba6-32942c29c4d9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.263750 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-kube-api-access-s5k4d" (OuterVolumeSpecName: "kube-api-access-s5k4d") pod "a484ab2a-7803-4ce1-8ba6-32942c29c4d9" (UID: "a484ab2a-7803-4ce1-8ba6-32942c29c4d9"). InnerVolumeSpecName "kube-api-access-s5k4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.267905 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-kube-api-access-hq6fc" (OuterVolumeSpecName: "kube-api-access-hq6fc") pod "2af69cbd-7106-43d8-9a09-e55a18ffa2bb" (UID: "2af69cbd-7106-43d8-9a09-e55a18ffa2bb"). InnerVolumeSpecName "kube-api-access-hq6fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.310608 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.358795 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq6fc\" (UniqueName: \"kubernetes.io/projected/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-kube-api-access-hq6fc\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.358859 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5k4d\" (UniqueName: \"kubernetes.io/projected/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-kube-api-access-s5k4d\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.358870 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af69cbd-7106-43d8-9a09-e55a18ffa2bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:55 crc 
kubenswrapper[4760]: I1204 12:34:55.358882 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a484ab2a-7803-4ce1-8ba6-32942c29c4d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.624816 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mnp4l" event={"ID":"a484ab2a-7803-4ce1-8ba6-32942c29c4d9","Type":"ContainerDied","Data":"235ce3781b86088df007711a074c40a43c66bf33c1e2b8c7a535431a6d20a360"} Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.624848 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mnp4l" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.624878 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="235ce3781b86088df007711a074c40a43c66bf33c1e2b8c7a535431a6d20a360" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.627992 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wd987" event={"ID":"4e2e8b9f-43da-419b-8cf4-f96f3ef4c863","Type":"ContainerDied","Data":"eb06769711f6ba8369810948a656f9c802afec2365c502d0d53d34049256cebe"} Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.628027 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb06769711f6ba8369810948a656f9c802afec2365c502d0d53d34049256cebe" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.628030 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-wd987" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.630797 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6133-account-create-update-gpcjt" event={"ID":"2af69cbd-7106-43d8-9a09-e55a18ffa2bb","Type":"ContainerDied","Data":"13fd9c115bbf47731e633f8ccd774d2a12a68fb450981d428a183d5b3afa543d"} Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.630880 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13fd9c115bbf47731e633f8ccd774d2a12a68fb450981d428a183d5b3afa543d" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.630997 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6133-account-create-update-gpcjt" Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.638966 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fa2846c9-8951-497c-bcae-d186f8f62265","Type":"ContainerStarted","Data":"3b498f5fc9d1bb4ed87d9411ccdeb9224d14d0611995744c1c2b090ea10242ad"} Dec 04 12:34:55 crc kubenswrapper[4760]: I1204 12:34:55.797161 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.089904 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3e42-account-create-update-h6rgv" Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.182158 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5b26f6a-29a2-4c94-8606-0b8d925489aa-operator-scripts\") pod \"b5b26f6a-29a2-4c94-8606-0b8d925489aa\" (UID: \"b5b26f6a-29a2-4c94-8606-0b8d925489aa\") " Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.182277 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn5x6\" (UniqueName: \"kubernetes.io/projected/b5b26f6a-29a2-4c94-8606-0b8d925489aa-kube-api-access-zn5x6\") pod \"b5b26f6a-29a2-4c94-8606-0b8d925489aa\" (UID: \"b5b26f6a-29a2-4c94-8606-0b8d925489aa\") " Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.183107 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b26f6a-29a2-4c94-8606-0b8d925489aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5b26f6a-29a2-4c94-8606-0b8d925489aa" (UID: "b5b26f6a-29a2-4c94-8606-0b8d925489aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.190360 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b26f6a-29a2-4c94-8606-0b8d925489aa-kube-api-access-zn5x6" (OuterVolumeSpecName: "kube-api-access-zn5x6") pod "b5b26f6a-29a2-4c94-8606-0b8d925489aa" (UID: "b5b26f6a-29a2-4c94-8606-0b8d925489aa"). InnerVolumeSpecName "kube-api-access-zn5x6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.285335 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5b26f6a-29a2-4c94-8606-0b8d925489aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.285402 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn5x6\" (UniqueName: \"kubernetes.io/projected/b5b26f6a-29a2-4c94-8606-0b8d925489aa-kube-api-access-zn5x6\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.589530 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.726466 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3e42-account-create-update-h6rgv" event={"ID":"b5b26f6a-29a2-4c94-8606-0b8d925489aa","Type":"ContainerDied","Data":"009605c3a6670802929a6ebd436042f0ace75fbcb9a2fad17ab5ff6eaff5a99d"} Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.726527 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="009605c3a6670802929a6ebd436042f0ace75fbcb9a2fad17ab5ff6eaff5a99d" Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.726611 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3e42-account-create-update-h6rgv" Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.733678 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ws7vm"] Dec 04 12:34:56 crc kubenswrapper[4760]: I1204 12:34:56.734040 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" podUID="dbe589a8-e8b2-4c34-913d-cc529e207984" containerName="dnsmasq-dns" containerID="cri-o://011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a" gracePeriod=10 Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.343625 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.422795 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-ovsdbserver-nb\") pod \"dbe589a8-e8b2-4c34-913d-cc529e207984\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.422998 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-config\") pod \"dbe589a8-e8b2-4c34-913d-cc529e207984\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.423031 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-dns-svc\") pod \"dbe589a8-e8b2-4c34-913d-cc529e207984\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.425267 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4gfhq\" (UniqueName: \"kubernetes.io/projected/dbe589a8-e8b2-4c34-913d-cc529e207984-kube-api-access-4gfhq\") pod \"dbe589a8-e8b2-4c34-913d-cc529e207984\" (UID: \"dbe589a8-e8b2-4c34-913d-cc529e207984\") " Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.430346 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbe589a8-e8b2-4c34-913d-cc529e207984-kube-api-access-4gfhq" (OuterVolumeSpecName: "kube-api-access-4gfhq") pod "dbe589a8-e8b2-4c34-913d-cc529e207984" (UID: "dbe589a8-e8b2-4c34-913d-cc529e207984"). InnerVolumeSpecName "kube-api-access-4gfhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.476444 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dbe589a8-e8b2-4c34-913d-cc529e207984" (UID: "dbe589a8-e8b2-4c34-913d-cc529e207984"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.477290 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dbe589a8-e8b2-4c34-913d-cc529e207984" (UID: "dbe589a8-e8b2-4c34-913d-cc529e207984"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.478267 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-config" (OuterVolumeSpecName: "config") pod "dbe589a8-e8b2-4c34-913d-cc529e207984" (UID: "dbe589a8-e8b2-4c34-913d-cc529e207984"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.529997 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.530059 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.530081 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gfhq\" (UniqueName: \"kubernetes.io/projected/dbe589a8-e8b2-4c34-913d-cc529e207984-kube-api-access-4gfhq\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.530096 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbe589a8-e8b2-4c34-913d-cc529e207984-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.741693 4760 generic.go:334] "Generic (PLEG): container finished" podID="dbe589a8-e8b2-4c34-913d-cc529e207984" containerID="011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a" exitCode=0 Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.741790 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" event={"ID":"dbe589a8-e8b2-4c34-913d-cc529e207984","Type":"ContainerDied","Data":"011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a"} Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.741832 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" event={"ID":"dbe589a8-e8b2-4c34-913d-cc529e207984","Type":"ContainerDied","Data":"b853ae26f73fb50f4628edb7046f13ebc4c993fee91da76ad4cababc0f2415aa"} Dec 04 
12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.741855 4760 scope.go:117] "RemoveContainer" containerID="011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.741886 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ws7vm" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.746808 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fa2846c9-8951-497c-bcae-d186f8f62265","Type":"ContainerStarted","Data":"29b051122c3bcddc7fc795740c284d3357ca4fdb3d177da9c177befef5d7de16"} Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.746876 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fa2846c9-8951-497c-bcae-d186f8f62265","Type":"ContainerStarted","Data":"a941ebad8c0fa78b80df22a177815f92c521607fe51b8745ccf96d2ddbcb5528"} Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.747156 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.781628 4760 scope.go:117] "RemoveContainer" containerID="e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.781881 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.607271785 podStartE2EDuration="3.78184389s" podCreationTimestamp="2025-12-04 12:34:54 +0000 UTC" firstStartedPulling="2025-12-04 12:34:55.324305209 +0000 UTC m=+1298.365751776" lastFinishedPulling="2025-12-04 12:34:56.498877314 +0000 UTC m=+1299.540323881" observedRunningTime="2025-12-04 12:34:57.776203 +0000 UTC m=+1300.817649567" watchObservedRunningTime="2025-12-04 12:34:57.78184389 +0000 UTC m=+1300.823290457" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.820849 4760 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ws7vm"] Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.827706 4760 scope.go:117] "RemoveContainer" containerID="011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a" Dec 04 12:34:57 crc kubenswrapper[4760]: E1204 12:34:57.828629 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a\": container with ID starting with 011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a not found: ID does not exist" containerID="011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.828682 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a"} err="failed to get container status \"011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a\": rpc error: code = NotFound desc = could not find container \"011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a\": container with ID starting with 011626138acd7e20db8270f973a2c5db684c1e8107d37d25d5b4ea030544123a not found: ID does not exist" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.828717 4760 scope.go:117] "RemoveContainer" containerID="e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7" Dec 04 12:34:57 crc kubenswrapper[4760]: E1204 12:34:57.829113 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7\": container with ID starting with e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7 not found: ID does not exist" containerID="e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 
12:34:57.829133 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7"} err="failed to get container status \"e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7\": rpc error: code = NotFound desc = could not find container \"e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7\": container with ID starting with e2bb8001aefd0c0294e30ea4445b86dcf4a4f35b45ada674f12bacf508f8e0d7 not found: ID does not exist" Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.830614 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ws7vm"] Dec 04 12:34:57 crc kubenswrapper[4760]: I1204 12:34:57.884452 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbe589a8-e8b2-4c34-913d-cc529e207984" path="/var/lib/kubelet/pods/dbe589a8-e8b2-4c34-913d-cc529e207984/volumes" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.138088 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vmvd8"] Dec 04 12:34:58 crc kubenswrapper[4760]: E1204 12:34:58.138688 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b26f6a-29a2-4c94-8606-0b8d925489aa" containerName="mariadb-account-create-update" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.138711 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b26f6a-29a2-4c94-8606-0b8d925489aa" containerName="mariadb-account-create-update" Dec 04 12:34:58 crc kubenswrapper[4760]: E1204 12:34:58.138738 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af69cbd-7106-43d8-9a09-e55a18ffa2bb" containerName="mariadb-account-create-update" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.138752 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af69cbd-7106-43d8-9a09-e55a18ffa2bb" containerName="mariadb-account-create-update" Dec 04 12:34:58 crc 
kubenswrapper[4760]: E1204 12:34:58.138787 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe589a8-e8b2-4c34-913d-cc529e207984" containerName="init" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.138799 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe589a8-e8b2-4c34-913d-cc529e207984" containerName="init" Dec 04 12:34:58 crc kubenswrapper[4760]: E1204 12:34:58.138814 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe589a8-e8b2-4c34-913d-cc529e207984" containerName="dnsmasq-dns" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.138821 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe589a8-e8b2-4c34-913d-cc529e207984" containerName="dnsmasq-dns" Dec 04 12:34:58 crc kubenswrapper[4760]: E1204 12:34:58.138834 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2e8b9f-43da-419b-8cf4-f96f3ef4c863" containerName="mariadb-database-create" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.138845 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2e8b9f-43da-419b-8cf4-f96f3ef4c863" containerName="mariadb-database-create" Dec 04 12:34:58 crc kubenswrapper[4760]: E1204 12:34:58.138855 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a484ab2a-7803-4ce1-8ba6-32942c29c4d9" containerName="mariadb-database-create" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.138863 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a484ab2a-7803-4ce1-8ba6-32942c29c4d9" containerName="mariadb-database-create" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.139150 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbe589a8-e8b2-4c34-913d-cc529e207984" containerName="dnsmasq-dns" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.139186 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af69cbd-7106-43d8-9a09-e55a18ffa2bb" containerName="mariadb-account-create-update" Dec 04 
12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.139232 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b26f6a-29a2-4c94-8606-0b8d925489aa" containerName="mariadb-account-create-update" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.139251 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2e8b9f-43da-419b-8cf4-f96f3ef4c863" containerName="mariadb-database-create" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.139263 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a484ab2a-7803-4ce1-8ba6-32942c29c4d9" containerName="mariadb-database-create" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.140489 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.144314 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mvczn" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.147990 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.155743 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vmvd8"] Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.327087 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2wtz\" (UniqueName: \"kubernetes.io/projected/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-kube-api-access-m2wtz\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.327242 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-combined-ca-bundle\") pod 
\"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.327420 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-config-data\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.327721 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-db-sync-config-data\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.430524 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-combined-ca-bundle\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.430794 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-config-data\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.430918 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-db-sync-config-data\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " 
pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.431029 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2wtz\" (UniqueName: \"kubernetes.io/projected/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-kube-api-access-m2wtz\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.437048 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-db-sync-config-data\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.437489 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-combined-ca-bundle\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.437719 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-config-data\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.458593 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2wtz\" (UniqueName: \"kubernetes.io/projected/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-kube-api-access-m2wtz\") pod \"glance-db-sync-vmvd8\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") " pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:58 crc kubenswrapper[4760]: I1204 12:34:58.459341 4760 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vmvd8" Dec 04 12:34:59 crc kubenswrapper[4760]: I1204 12:34:59.105613 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vmvd8"] Dec 04 12:34:59 crc kubenswrapper[4760]: I1204 12:34:59.775026 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vmvd8" event={"ID":"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74","Type":"ContainerStarted","Data":"dcfacd09363d863e4bc606f4e14e070b19677bd09aace4213f26ee90b17f4af8"} Dec 04 12:35:02 crc kubenswrapper[4760]: I1204 12:35:02.851333 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:35:02 crc kubenswrapper[4760]: E1204 12:35:02.851869 4760 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 12:35:02 crc kubenswrapper[4760]: E1204 12:35:02.851888 4760 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 12:35:02 crc kubenswrapper[4760]: E1204 12:35:02.851956 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift podName:ad805440-7d12-4b3e-b11b-c37463e95bb7 nodeName:}" failed. No retries permitted until 2025-12-04 12:35:18.85193436 +0000 UTC m=+1321.893380927 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift") pod "swift-storage-0" (UID: "ad805440-7d12-4b3e-b11b-c37463e95bb7") : configmap "swift-ring-files" not found Dec 04 12:35:03 crc kubenswrapper[4760]: I1204 12:35:03.820098 4760 generic.go:334] "Generic (PLEG): container finished" podID="4ce3174d-015c-4a85-b58d-af7603479902" containerID="c60de42ac34004b15b2af639872103dddc9d22c78376f01c2f9c4dc0c121aa02" exitCode=0 Dec 04 12:35:03 crc kubenswrapper[4760]: I1204 12:35:03.820174 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m4x4d" event={"ID":"4ce3174d-015c-4a85-b58d-af7603479902","Type":"ContainerDied","Data":"c60de42ac34004b15b2af639872103dddc9d22c78376f01c2f9c4dc0c121aa02"} Dec 04 12:35:05 crc kubenswrapper[4760]: I1204 12:35:05.998034 4760 generic.go:334] "Generic (PLEG): container finished" podID="b6477a59-2dc3-4fff-907e-7e927cf257d3" containerID="59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc" exitCode=0 Dec 04 12:35:06 crc kubenswrapper[4760]: I1204 12:35:05.998153 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b6477a59-2dc3-4fff-907e-7e927cf257d3","Type":"ContainerDied","Data":"59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc"} Dec 04 12:35:06 crc kubenswrapper[4760]: I1204 12:35:06.005229 4760 generic.go:334] "Generic (PLEG): container finished" podID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" containerID="c643d91bc9dc7bde5a3c757a5b02bc7fd000256f842eb4fe2df55b32bc742ec2" exitCode=0 Dec 04 12:35:06 crc kubenswrapper[4760]: I1204 12:35:06.005350 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62ceba36-f8bc-4644-978c-08a4cbf88ae5","Type":"ContainerDied","Data":"c643d91bc9dc7bde5a3c757a5b02bc7fd000256f842eb4fe2df55b32bc742ec2"} Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.295779 
4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.296432 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-76hxp" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.615309 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-frcmm-config-mdwg9"] Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.618840 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.632231 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-frcmm-config-mdwg9"] Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.636014 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.713463 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-scripts\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.713611 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-log-ovn\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.713682 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-additional-scripts\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.713925 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run-ovn\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.714040 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv7w7\" (UniqueName: \"kubernetes.io/projected/077b5cdd-b8d6-4f81-b466-e73b784cb88b-kube-api-access-mv7w7\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.714200 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.817043 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-scripts\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.817146 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-log-ovn\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.817241 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-additional-scripts\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.817377 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run-ovn\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.817421 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv7w7\" (UniqueName: \"kubernetes.io/projected/077b5cdd-b8d6-4f81-b466-e73b784cb88b-kube-api-access-mv7w7\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.817487 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.817808 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-log-ovn\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.818001 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.818203 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run-ovn\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.820416 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-scripts\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.820721 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-additional-scripts\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.848768 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv7w7\" (UniqueName: 
\"kubernetes.io/projected/077b5cdd-b8d6-4f81-b466-e73b784cb88b-kube-api-access-mv7w7\") pod \"ovn-controller-frcmm-config-mdwg9\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:07 crc kubenswrapper[4760]: I1204 12:35:07.951282 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:09 crc kubenswrapper[4760]: I1204 12:35:09.651412 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 04 12:35:17 crc kubenswrapper[4760]: E1204 12:35:17.259447 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 04 12:35:17 crc kubenswrapper[4760]: E1204 12:35:17.260868 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2wtz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-vmvd8_openstack(2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Dec 04 12:35:17 crc kubenswrapper[4760]: E1204 12:35:17.262370 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-vmvd8" podUID="2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.438323 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.612659 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-combined-ca-bundle\") pod \"4ce3174d-015c-4a85-b58d-af7603479902\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.612814 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-ring-data-devices\") pod \"4ce3174d-015c-4a85-b58d-af7603479902\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.612950 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ce3174d-015c-4a85-b58d-af7603479902-etc-swift\") pod \"4ce3174d-015c-4a85-b58d-af7603479902\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.613077 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpcsg\" (UniqueName: \"kubernetes.io/projected/4ce3174d-015c-4a85-b58d-af7603479902-kube-api-access-qpcsg\") pod \"4ce3174d-015c-4a85-b58d-af7603479902\" (UID: 
\"4ce3174d-015c-4a85-b58d-af7603479902\") " Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.613114 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-dispersionconf\") pod \"4ce3174d-015c-4a85-b58d-af7603479902\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.613146 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-swiftconf\") pod \"4ce3174d-015c-4a85-b58d-af7603479902\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.613262 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-scripts\") pod \"4ce3174d-015c-4a85-b58d-af7603479902\" (UID: \"4ce3174d-015c-4a85-b58d-af7603479902\") " Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.614443 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4ce3174d-015c-4a85-b58d-af7603479902" (UID: "4ce3174d-015c-4a85-b58d-af7603479902"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.615422 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce3174d-015c-4a85-b58d-af7603479902-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4ce3174d-015c-4a85-b58d-af7603479902" (UID: "4ce3174d-015c-4a85-b58d-af7603479902"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.624688 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce3174d-015c-4a85-b58d-af7603479902-kube-api-access-qpcsg" (OuterVolumeSpecName: "kube-api-access-qpcsg") pod "4ce3174d-015c-4a85-b58d-af7603479902" (UID: "4ce3174d-015c-4a85-b58d-af7603479902"). InnerVolumeSpecName "kube-api-access-qpcsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.625252 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4ce3174d-015c-4a85-b58d-af7603479902" (UID: "4ce3174d-015c-4a85-b58d-af7603479902"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.646155 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4ce3174d-015c-4a85-b58d-af7603479902" (UID: "4ce3174d-015c-4a85-b58d-af7603479902"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.649633 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-scripts" (OuterVolumeSpecName: "scripts") pod "4ce3174d-015c-4a85-b58d-af7603479902" (UID: "4ce3174d-015c-4a85-b58d-af7603479902"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.649914 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ce3174d-015c-4a85-b58d-af7603479902" (UID: "4ce3174d-015c-4a85-b58d-af7603479902"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.716267 4760 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ce3174d-015c-4a85-b58d-af7603479902-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.716775 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpcsg\" (UniqueName: \"kubernetes.io/projected/4ce3174d-015c-4a85-b58d-af7603479902-kube-api-access-qpcsg\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.716860 4760 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.716924 4760 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.716989 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.717060 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4ce3174d-015c-4a85-b58d-af7603479902-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.717154 4760 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ce3174d-015c-4a85-b58d-af7603479902-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.773612 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-frcmm-config-mdwg9"] Dec 04 12:35:17 crc kubenswrapper[4760]: W1204 12:35:17.784027 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod077b5cdd_b8d6_4f81_b466_e73b784cb88b.slice/crio-91c4b596c251ace8e5a82cbed43705bbe50810e9faa3bf02a16a67ad00fc03a6 WatchSource:0}: Error finding container 91c4b596c251ace8e5a82cbed43705bbe50810e9faa3bf02a16a67ad00fc03a6: Status 404 returned error can't find the container with id 91c4b596c251ace8e5a82cbed43705bbe50810e9faa3bf02a16a67ad00fc03a6 Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.785283 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-frcmm" podUID="52334746-99b6-4056-a7d7-6df95b72d8de" containerName="ovn-controller" probeResult="failure" output=< Dec 04 12:35:17 crc kubenswrapper[4760]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 12:35:17 crc kubenswrapper[4760]: > Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.952023 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b6477a59-2dc3-4fff-907e-7e927cf257d3","Type":"ContainerStarted","Data":"34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9"} Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.952826 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.961332 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62ceba36-f8bc-4644-978c-08a4cbf88ae5","Type":"ContainerStarted","Data":"0b5af118690eb894d7059f40d968e970ae6c204dc44afb15b8d6ec0656baa2ce"} Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.961673 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.965097 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-frcmm-config-mdwg9" event={"ID":"077b5cdd-b8d6-4f81-b466-e73b784cb88b","Type":"ContainerStarted","Data":"91c4b596c251ace8e5a82cbed43705bbe50810e9faa3bf02a16a67ad00fc03a6"} Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.967850 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m4x4d" event={"ID":"4ce3174d-015c-4a85-b58d-af7603479902","Type":"ContainerDied","Data":"5e229010f20a39d940f3306b40a342126b62bc739335d7613100ac88e2d89939"} Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.967994 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m4x4d" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.968645 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e229010f20a39d940f3306b40a342126b62bc739335d7613100ac88e2d89939" Dec 04 12:35:17 crc kubenswrapper[4760]: E1204 12:35:17.974430 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-vmvd8" podUID="2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" Dec 04 12:35:17 crc kubenswrapper[4760]: I1204 12:35:17.987502 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.313421195 podStartE2EDuration="1m42.987469168s" podCreationTimestamp="2025-12-04 12:33:35 +0000 UTC" firstStartedPulling="2025-12-04 12:33:38.161761724 +0000 UTC m=+1221.203208291" lastFinishedPulling="2025-12-04 12:34:30.835809697 +0000 UTC m=+1273.877256264" observedRunningTime="2025-12-04 12:35:17.980962751 +0000 UTC m=+1321.022409338" watchObservedRunningTime="2025-12-04 12:35:17.987469168 +0000 UTC m=+1321.028915735" Dec 04 12:35:18 crc kubenswrapper[4760]: I1204 12:35:18.024883 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.393907101 podStartE2EDuration="1m42.024845584s" podCreationTimestamp="2025-12-04 12:33:36 +0000 UTC" firstStartedPulling="2025-12-04 12:33:39.351410716 +0000 UTC m=+1222.392857283" lastFinishedPulling="2025-12-04 12:34:30.982349189 +0000 UTC m=+1274.023795766" observedRunningTime="2025-12-04 12:35:18.019119382 +0000 UTC m=+1321.060566049" watchObservedRunningTime="2025-12-04 12:35:18.024845584 +0000 UTC m=+1321.066292151" Dec 04 12:35:18 crc kubenswrapper[4760]: I1204 12:35:18.853447 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:35:18 crc kubenswrapper[4760]: I1204 12:35:18.867381 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ad805440-7d12-4b3e-b11b-c37463e95bb7-etc-swift\") pod \"swift-storage-0\" (UID: \"ad805440-7d12-4b3e-b11b-c37463e95bb7\") " pod="openstack/swift-storage-0" Dec 04 12:35:18 crc kubenswrapper[4760]: I1204 12:35:18.878016 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 04 12:35:18 crc kubenswrapper[4760]: I1204 12:35:18.983620 4760 generic.go:334] "Generic (PLEG): container finished" podID="077b5cdd-b8d6-4f81-b466-e73b784cb88b" containerID="852c2641d49371e30153bafa2185d27d17baaf3c934ac2bc2f501a409ec34586" exitCode=0 Dec 04 12:35:18 crc kubenswrapper[4760]: I1204 12:35:18.983781 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-frcmm-config-mdwg9" event={"ID":"077b5cdd-b8d6-4f81-b466-e73b784cb88b","Type":"ContainerDied","Data":"852c2641d49371e30153bafa2185d27d17baaf3c934ac2bc2f501a409ec34586"} Dec 04 12:35:19 crc kubenswrapper[4760]: I1204 12:35:19.751026 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 12:35:19 crc kubenswrapper[4760]: I1204 12:35:19.996284 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"0e61d73579c99ce6909e7127e5afc8d53d423e6c99e5de71210adace9081b9a4"} Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.346612 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.495617 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-additional-scripts\") pod \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.495821 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-scripts\") pod \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.495920 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run\") pod \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.495954 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv7w7\" (UniqueName: \"kubernetes.io/projected/077b5cdd-b8d6-4f81-b466-e73b784cb88b-kube-api-access-mv7w7\") pod \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.495980 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-log-ovn\") pod \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.496123 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run-ovn\") pod \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\" (UID: \"077b5cdd-b8d6-4f81-b466-e73b784cb88b\") " Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.496667 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run" (OuterVolumeSpecName: "var-run") pod "077b5cdd-b8d6-4f81-b466-e73b784cb88b" (UID: "077b5cdd-b8d6-4f81-b466-e73b784cb88b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.496774 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "077b5cdd-b8d6-4f81-b466-e73b784cb88b" (UID: "077b5cdd-b8d6-4f81-b466-e73b784cb88b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.496778 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "077b5cdd-b8d6-4f81-b466-e73b784cb88b" (UID: "077b5cdd-b8d6-4f81-b466-e73b784cb88b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.497284 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "077b5cdd-b8d6-4f81-b466-e73b784cb88b" (UID: "077b5cdd-b8d6-4f81-b466-e73b784cb88b"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.497676 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-scripts" (OuterVolumeSpecName: "scripts") pod "077b5cdd-b8d6-4f81-b466-e73b784cb88b" (UID: "077b5cdd-b8d6-4f81-b466-e73b784cb88b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.521742 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077b5cdd-b8d6-4f81-b466-e73b784cb88b-kube-api-access-mv7w7" (OuterVolumeSpecName: "kube-api-access-mv7w7") pod "077b5cdd-b8d6-4f81-b466-e73b784cb88b" (UID: "077b5cdd-b8d6-4f81-b466-e73b784cb88b"). InnerVolumeSpecName "kube-api-access-mv7w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.598590 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.598653 4760 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.598666 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv7w7\" (UniqueName: \"kubernetes.io/projected/077b5cdd-b8d6-4f81-b466-e73b784cb88b-kube-api-access-mv7w7\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.598681 4760 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 
12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.598695 4760 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/077b5cdd-b8d6-4f81-b466-e73b784cb88b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:20 crc kubenswrapper[4760]: I1204 12:35:20.598708 4760 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/077b5cdd-b8d6-4f81-b466-e73b784cb88b-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:20 crc kubenswrapper[4760]: E1204 12:35:20.923035 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce3174d_015c_4a85_b58d_af7603479902.slice\": RecentStats: unable to find data in memory cache]" Dec 04 12:35:21 crc kubenswrapper[4760]: I1204 12:35:21.011595 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-frcmm-config-mdwg9" event={"ID":"077b5cdd-b8d6-4f81-b466-e73b784cb88b","Type":"ContainerDied","Data":"91c4b596c251ace8e5a82cbed43705bbe50810e9faa3bf02a16a67ad00fc03a6"} Dec 04 12:35:21 crc kubenswrapper[4760]: I1204 12:35:21.011677 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91c4b596c251ace8e5a82cbed43705bbe50810e9faa3bf02a16a67ad00fc03a6" Dec 04 12:35:21 crc kubenswrapper[4760]: I1204 12:35:21.011784 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-frcmm-config-mdwg9" Dec 04 12:35:21 crc kubenswrapper[4760]: I1204 12:35:21.523327 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-frcmm-config-mdwg9"] Dec 04 12:35:21 crc kubenswrapper[4760]: I1204 12:35:21.533687 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-frcmm-config-mdwg9"] Dec 04 12:35:21 crc kubenswrapper[4760]: I1204 12:35:21.880070 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077b5cdd-b8d6-4f81-b466-e73b784cb88b" path="/var/lib/kubelet/pods/077b5cdd-b8d6-4f81-b466-e73b784cb88b/volumes" Dec 04 12:35:22 crc kubenswrapper[4760]: I1204 12:35:22.027113 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"e1e775e7c98b25ec2e5f8636982ece69255f43ddaf1fa51c6f1a8a46765b9448"} Dec 04 12:35:22 crc kubenswrapper[4760]: I1204 12:35:22.027182 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"64a052dbc9182e5a96c7415a2e168cf814e7e5f34640a470f198122e7187389c"} Dec 04 12:35:22 crc kubenswrapper[4760]: I1204 12:35:22.027196 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"1562a0d4e18720ec152f5100475d9797a4ac89c1339bb6022eaa7b369161d90c"} Dec 04 12:35:22 crc kubenswrapper[4760]: I1204 12:35:22.027210 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"3e153c9d8b7bddac64680466e39c6baaa88c14b400ae85186d5953becb39ae6b"} Dec 04 12:35:22 crc kubenswrapper[4760]: I1204 12:35:22.790322 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ovn-controller-frcmm" Dec 04 12:35:24 crc kubenswrapper[4760]: I1204 12:35:24.062755 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"ad598c79a559f4cf13e30cc519275897f66da33e781defdd06d29ed071cca3ab"} Dec 04 12:35:24 crc kubenswrapper[4760]: I1204 12:35:24.063401 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"f0925b2094cd30a53aeaf4bd44f801dc2572d74881c06f60ec2da100f17f7d08"} Dec 04 12:35:24 crc kubenswrapper[4760]: I1204 12:35:24.063419 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"959cdf937d1d7d9b7a7e248e89674743cfb1f20d1ee4113f6bcec0534e50521f"} Dec 04 12:35:25 crc kubenswrapper[4760]: I1204 12:35:25.092578 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"31159b58de373e4f217082db7522fb8b862445307157e93720fc427acdca35b5"} Dec 04 12:35:26 crc kubenswrapper[4760]: I1204 12:35:26.107112 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"ae7dd8ef658feb5e5d0c4a74e33b497de6436cb18f6ab2a8b9e5863aeb0ee6ec"} Dec 04 12:35:26 crc kubenswrapper[4760]: I1204 12:35:26.107651 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"61d6686033e8d9a06858f0402512f3ee729cc63fc03935517683cc68247662e7"} Dec 04 12:35:27 crc kubenswrapper[4760]: I1204 12:35:27.071037 4760 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/rabbitmq-server-0" podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 04 12:35:27 crc kubenswrapper[4760]: I1204 12:35:27.128990 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"b9bee69caf76fcdf05c217284e582664f6cb93cc382f6a5e5bf0592858291b29"} Dec 04 12:35:27 crc kubenswrapper[4760]: I1204 12:35:27.129063 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"a6d219d90d498758f9c40e03d96b0fa7b444aa2b828d5382e6da3d1c8c655682"} Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.005210 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.145725 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"55469d0a6be51f4354090b2bcaea9d4ffa39236dd5b12beaf93746851bddc993"} Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.146738 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"f715e11f2260a73be0885497852d24219c8d55f79ee5b4b882dea9e099a24c13"} Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.146843 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"ad805440-7d12-4b3e-b11b-c37463e95bb7","Type":"ContainerStarted","Data":"a2ec3ac3fb5aad78147739f2532834609ec8b303f0d767a4adbbf6bfcca8b2cf"} Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.198880 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.344643347 podStartE2EDuration="43.198841589s" podCreationTimestamp="2025-12-04 12:34:45 +0000 UTC" firstStartedPulling="2025-12-04 12:35:19.778523491 +0000 UTC m=+1322.819970058" lastFinishedPulling="2025-12-04 12:35:25.632721733 +0000 UTC m=+1328.674168300" observedRunningTime="2025-12-04 12:35:28.194807262 +0000 UTC m=+1331.236253869" watchObservedRunningTime="2025-12-04 12:35:28.198841589 +0000 UTC m=+1331.240288156" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.590966 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tvztj"] Dec 04 12:35:28 crc kubenswrapper[4760]: E1204 12:35:28.591726 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077b5cdd-b8d6-4f81-b466-e73b784cb88b" containerName="ovn-config" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.591756 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="077b5cdd-b8d6-4f81-b466-e73b784cb88b" containerName="ovn-config" Dec 04 12:35:28 crc kubenswrapper[4760]: E1204 12:35:28.591780 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce3174d-015c-4a85-b58d-af7603479902" containerName="swift-ring-rebalance" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.591789 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce3174d-015c-4a85-b58d-af7603479902" containerName="swift-ring-rebalance" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.591998 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="077b5cdd-b8d6-4f81-b466-e73b784cb88b" containerName="ovn-config" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.592020 4760 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce3174d-015c-4a85-b58d-af7603479902" containerName="swift-ring-rebalance" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.593639 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.611768 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.613102 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tvztj"] Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.764304 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94svf\" (UniqueName: \"kubernetes.io/projected/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-kube-api-access-94svf\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.764725 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-config\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.764901 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-svc\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.764979 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.765576 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.765668 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.869774 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-config\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.867966 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-config\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.869950 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-svc\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.870766 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-svc\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.871484 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.872127 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.872749 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.872808 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.872865 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94svf\" (UniqueName: \"kubernetes.io/projected/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-kube-api-access-94svf\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.874204 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.874522 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.901677 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94svf\" (UniqueName: \"kubernetes.io/projected/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-kube-api-access-94svf\") pod \"dnsmasq-dns-764c5664d7-tvztj\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:28 crc kubenswrapper[4760]: I1204 12:35:28.947140 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:29 crc kubenswrapper[4760]: I1204 12:35:29.510183 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tvztj"] Dec 04 12:35:30 crc kubenswrapper[4760]: I1204 12:35:30.172544 4760 generic.go:334] "Generic (PLEG): container finished" podID="c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" containerID="ad28aaa019017eb682b8f12a7dd1c734a8e583ad48922b74784c97de851d0a29" exitCode=0 Dec 04 12:35:30 crc kubenswrapper[4760]: I1204 12:35:30.172684 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" event={"ID":"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213","Type":"ContainerDied","Data":"ad28aaa019017eb682b8f12a7dd1c734a8e583ad48922b74784c97de851d0a29"} Dec 04 12:35:30 crc kubenswrapper[4760]: I1204 12:35:30.173109 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" event={"ID":"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213","Type":"ContainerStarted","Data":"9c600dd39c97a26ec7f0ba8cd2138d2fada28c72a86ff038d32acaeaccdd8dfe"} Dec 04 12:35:31 crc kubenswrapper[4760]: I1204 12:35:31.189282 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vmvd8" event={"ID":"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74","Type":"ContainerStarted","Data":"0f91f22ee3cee8f1e541180a1829532b03212e89dddd8bb0c41db66a172d6964"} Dec 04 12:35:31 crc kubenswrapper[4760]: E1204 12:35:31.199020 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce3174d_015c_4a85_b58d_af7603479902.slice\": RecentStats: unable to find data in memory cache]" Dec 04 12:35:31 crc kubenswrapper[4760]: I1204 12:35:31.200022 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" 
event={"ID":"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213","Type":"ContainerStarted","Data":"7927cd2b5bb980fbc7da2a3fcde5347dce709caa429923ca2ece1c501e4d930b"} Dec 04 12:35:31 crc kubenswrapper[4760]: I1204 12:35:31.200826 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:31 crc kubenswrapper[4760]: I1204 12:35:31.239900 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vmvd8" podStartSLOduration=2.062653539 podStartE2EDuration="33.239872331s" podCreationTimestamp="2025-12-04 12:34:58 +0000 UTC" firstStartedPulling="2025-12-04 12:34:59.116790735 +0000 UTC m=+1302.158237302" lastFinishedPulling="2025-12-04 12:35:30.294009517 +0000 UTC m=+1333.335456094" observedRunningTime="2025-12-04 12:35:31.217712688 +0000 UTC m=+1334.259159255" watchObservedRunningTime="2025-12-04 12:35:31.239872331 +0000 UTC m=+1334.281318898" Dec 04 12:35:37 crc kubenswrapper[4760]: I1204 12:35:37.069927 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 12:35:37 crc kubenswrapper[4760]: I1204 12:35:37.123168 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" podStartSLOduration=9.123124324 podStartE2EDuration="9.123124324s" podCreationTimestamp="2025-12-04 12:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:35:31.258071809 +0000 UTC m=+1334.299518396" watchObservedRunningTime="2025-12-04 12:35:37.123124324 +0000 UTC m=+1340.164570911" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.056964 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.106630 4760 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-b62a-account-create-update-nv688"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.116946 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b62a-account-create-update-nv688" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.125796 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.130227 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b62a-account-create-update-nv688"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.189457 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-xc9fh"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.191126 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xc9fh" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.217469 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-xc9fh"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.276877 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-wc247"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.278799 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-wc247" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.300421 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9mqd\" (UniqueName: \"kubernetes.io/projected/34e68130-6cfc-4349-9ca6-1eaa2690e632-kube-api-access-v9mqd\") pod \"barbican-db-create-xc9fh\" (UID: \"34e68130-6cfc-4349-9ca6-1eaa2690e632\") " pod="openstack/barbican-db-create-xc9fh" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.300537 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e68130-6cfc-4349-9ca6-1eaa2690e632-operator-scripts\") pod \"barbican-db-create-xc9fh\" (UID: \"34e68130-6cfc-4349-9ca6-1eaa2690e632\") " pod="openstack/barbican-db-create-xc9fh" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.300575 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfdcv\" (UniqueName: \"kubernetes.io/projected/6fcc928e-076c-4853-9d29-56522dc04fd8-kube-api-access-zfdcv\") pod \"barbican-b62a-account-create-update-nv688\" (UID: \"6fcc928e-076c-4853-9d29-56522dc04fd8\") " pod="openstack/barbican-b62a-account-create-update-nv688" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.300648 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fcc928e-076c-4853-9d29-56522dc04fd8-operator-scripts\") pod \"barbican-b62a-account-create-update-nv688\" (UID: \"6fcc928e-076c-4853-9d29-56522dc04fd8\") " pod="openstack/barbican-b62a-account-create-update-nv688" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.343686 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wc247"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.359142 4760 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wwp5d"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.360740 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.365621 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.367881 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.368166 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.376185 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnvh9" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.378936 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-705f-account-create-update-8dpgv"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.380968 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-705f-account-create-update-8dpgv" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.386126 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.405510 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2bfa70a-49e1-4083-80fd-8e32e354de04-operator-scripts\") pod \"cinder-db-create-wc247\" (UID: \"f2bfa70a-49e1-4083-80fd-8e32e354de04\") " pod="openstack/cinder-db-create-wc247" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.405596 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9mqd\" (UniqueName: \"kubernetes.io/projected/34e68130-6cfc-4349-9ca6-1eaa2690e632-kube-api-access-v9mqd\") pod \"barbican-db-create-xc9fh\" (UID: \"34e68130-6cfc-4349-9ca6-1eaa2690e632\") " pod="openstack/barbican-db-create-xc9fh" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.405636 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgz9m\" (UniqueName: \"kubernetes.io/projected/f2bfa70a-49e1-4083-80fd-8e32e354de04-kube-api-access-wgz9m\") pod \"cinder-db-create-wc247\" (UID: \"f2bfa70a-49e1-4083-80fd-8e32e354de04\") " pod="openstack/cinder-db-create-wc247" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.405691 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e68130-6cfc-4349-9ca6-1eaa2690e632-operator-scripts\") pod \"barbican-db-create-xc9fh\" (UID: \"34e68130-6cfc-4349-9ca6-1eaa2690e632\") " pod="openstack/barbican-db-create-xc9fh" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.405733 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zfdcv\" (UniqueName: \"kubernetes.io/projected/6fcc928e-076c-4853-9d29-56522dc04fd8-kube-api-access-zfdcv\") pod \"barbican-b62a-account-create-update-nv688\" (UID: \"6fcc928e-076c-4853-9d29-56522dc04fd8\") " pod="openstack/barbican-b62a-account-create-update-nv688" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.405795 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fcc928e-076c-4853-9d29-56522dc04fd8-operator-scripts\") pod \"barbican-b62a-account-create-update-nv688\" (UID: \"6fcc928e-076c-4853-9d29-56522dc04fd8\") " pod="openstack/barbican-b62a-account-create-update-nv688" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.406851 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fcc928e-076c-4853-9d29-56522dc04fd8-operator-scripts\") pod \"barbican-b62a-account-create-update-nv688\" (UID: \"6fcc928e-076c-4853-9d29-56522dc04fd8\") " pod="openstack/barbican-b62a-account-create-update-nv688" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.407510 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e68130-6cfc-4349-9ca6-1eaa2690e632-operator-scripts\") pod \"barbican-db-create-xc9fh\" (UID: \"34e68130-6cfc-4349-9ca6-1eaa2690e632\") " pod="openstack/barbican-db-create-xc9fh" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.408431 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wwp5d"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.418039 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-705f-account-create-update-8dpgv"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.427200 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-dn8jg"] Dec 04 12:35:38 crc 
kubenswrapper[4760]: I1204 12:35:38.429081 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-dn8jg" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.446995 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-dn8jg"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.475261 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfdcv\" (UniqueName: \"kubernetes.io/projected/6fcc928e-076c-4853-9d29-56522dc04fd8-kube-api-access-zfdcv\") pod \"barbican-b62a-account-create-update-nv688\" (UID: \"6fcc928e-076c-4853-9d29-56522dc04fd8\") " pod="openstack/barbican-b62a-account-create-update-nv688" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.475261 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9mqd\" (UniqueName: \"kubernetes.io/projected/34e68130-6cfc-4349-9ca6-1eaa2690e632-kube-api-access-v9mqd\") pod \"barbican-db-create-xc9fh\" (UID: \"34e68130-6cfc-4349-9ca6-1eaa2690e632\") " pod="openstack/barbican-db-create-xc9fh" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.508604 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958c379a-8eea-4bb9-8e49-57a92168cf30-operator-scripts\") pod \"manila-705f-account-create-update-8dpgv\" (UID: \"958c379a-8eea-4bb9-8e49-57a92168cf30\") " pod="openstack/manila-705f-account-create-update-8dpgv" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.509227 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlw9\" (UniqueName: \"kubernetes.io/projected/958c379a-8eea-4bb9-8e49-57a92168cf30-kube-api-access-qrlw9\") pod \"manila-705f-account-create-update-8dpgv\" (UID: \"958c379a-8eea-4bb9-8e49-57a92168cf30\") " 
pod="openstack/manila-705f-account-create-update-8dpgv" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.509306 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0642bc5b-899a-4334-80ff-4eac919be523-operator-scripts\") pod \"manila-db-create-dn8jg\" (UID: \"0642bc5b-899a-4334-80ff-4eac919be523\") " pod="openstack/manila-db-create-dn8jg" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.509344 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlnjc\" (UniqueName: \"kubernetes.io/projected/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-kube-api-access-vlnjc\") pod \"keystone-db-sync-wwp5d\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.509371 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2bfa70a-49e1-4083-80fd-8e32e354de04-operator-scripts\") pod \"cinder-db-create-wc247\" (UID: \"f2bfa70a-49e1-4083-80fd-8e32e354de04\") " pod="openstack/cinder-db-create-wc247" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.509402 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-config-data\") pod \"keystone-db-sync-wwp5d\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.509421 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgz9m\" (UniqueName: \"kubernetes.io/projected/f2bfa70a-49e1-4083-80fd-8e32e354de04-kube-api-access-wgz9m\") pod \"cinder-db-create-wc247\" (UID: \"f2bfa70a-49e1-4083-80fd-8e32e354de04\") " 
pod="openstack/cinder-db-create-wc247" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.509452 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-combined-ca-bundle\") pod \"keystone-db-sync-wwp5d\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.509539 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2s2c\" (UniqueName: \"kubernetes.io/projected/0642bc5b-899a-4334-80ff-4eac919be523-kube-api-access-k2s2c\") pod \"manila-db-create-dn8jg\" (UID: \"0642bc5b-899a-4334-80ff-4eac919be523\") " pod="openstack/manila-db-create-dn8jg" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.513265 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2bfa70a-49e1-4083-80fd-8e32e354de04-operator-scripts\") pod \"cinder-db-create-wc247\" (UID: \"f2bfa70a-49e1-4083-80fd-8e32e354de04\") " pod="openstack/cinder-db-create-wc247" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.537418 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xc9fh" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.549058 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgz9m\" (UniqueName: \"kubernetes.io/projected/f2bfa70a-49e1-4083-80fd-8e32e354de04-kube-api-access-wgz9m\") pod \"cinder-db-create-wc247\" (UID: \"f2bfa70a-49e1-4083-80fd-8e32e354de04\") " pod="openstack/cinder-db-create-wc247" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.602516 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-wc247" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.612612 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlw9\" (UniqueName: \"kubernetes.io/projected/958c379a-8eea-4bb9-8e49-57a92168cf30-kube-api-access-qrlw9\") pod \"manila-705f-account-create-update-8dpgv\" (UID: \"958c379a-8eea-4bb9-8e49-57a92168cf30\") " pod="openstack/manila-705f-account-create-update-8dpgv" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.612715 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0642bc5b-899a-4334-80ff-4eac919be523-operator-scripts\") pod \"manila-db-create-dn8jg\" (UID: \"0642bc5b-899a-4334-80ff-4eac919be523\") " pod="openstack/manila-db-create-dn8jg" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.612782 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlnjc\" (UniqueName: \"kubernetes.io/projected/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-kube-api-access-vlnjc\") pod \"keystone-db-sync-wwp5d\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.612821 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-config-data\") pod \"keystone-db-sync-wwp5d\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.612870 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-combined-ca-bundle\") pod \"keystone-db-sync-wwp5d\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " 
pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.612921 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2s2c\" (UniqueName: \"kubernetes.io/projected/0642bc5b-899a-4334-80ff-4eac919be523-kube-api-access-k2s2c\") pod \"manila-db-create-dn8jg\" (UID: \"0642bc5b-899a-4334-80ff-4eac919be523\") " pod="openstack/manila-db-create-dn8jg" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.612960 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958c379a-8eea-4bb9-8e49-57a92168cf30-operator-scripts\") pod \"manila-705f-account-create-update-8dpgv\" (UID: \"958c379a-8eea-4bb9-8e49-57a92168cf30\") " pod="openstack/manila-705f-account-create-update-8dpgv" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.613345 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-25bph"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.614065 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958c379a-8eea-4bb9-8e49-57a92168cf30-operator-scripts\") pod \"manila-705f-account-create-update-8dpgv\" (UID: \"958c379a-8eea-4bb9-8e49-57a92168cf30\") " pod="openstack/manila-705f-account-create-update-8dpgv" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.614911 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-25bph" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.615278 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0642bc5b-899a-4334-80ff-4eac919be523-operator-scripts\") pod \"manila-db-create-dn8jg\" (UID: \"0642bc5b-899a-4334-80ff-4eac919be523\") " pod="openstack/manila-db-create-dn8jg" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.620915 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-config-data\") pod \"keystone-db-sync-wwp5d\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.630957 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-combined-ca-bundle\") pod \"keystone-db-sync-wwp5d\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.663093 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlw9\" (UniqueName: \"kubernetes.io/projected/958c379a-8eea-4bb9-8e49-57a92168cf30-kube-api-access-qrlw9\") pod \"manila-705f-account-create-update-8dpgv\" (UID: \"958c379a-8eea-4bb9-8e49-57a92168cf30\") " pod="openstack/manila-705f-account-create-update-8dpgv" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.671295 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-25bph"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.682303 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2s2c\" (UniqueName: 
\"kubernetes.io/projected/0642bc5b-899a-4334-80ff-4eac919be523-kube-api-access-k2s2c\") pod \"manila-db-create-dn8jg\" (UID: \"0642bc5b-899a-4334-80ff-4eac919be523\") " pod="openstack/manila-db-create-dn8jg" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.686460 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlnjc\" (UniqueName: \"kubernetes.io/projected/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-kube-api-access-vlnjc\") pod \"keystone-db-sync-wwp5d\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.700326 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.715283 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkzzt\" (UniqueName: \"kubernetes.io/projected/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-kube-api-access-bkzzt\") pod \"neutron-db-create-25bph\" (UID: \"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8\") " pod="openstack/neutron-db-create-25bph" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.715460 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-operator-scripts\") pod \"neutron-db-create-25bph\" (UID: \"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8\") " pod="openstack/neutron-db-create-25bph" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.720012 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-705f-account-create-update-8dpgv" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.725273 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-cf87-account-create-update-8lng5"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.727180 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-cf87-account-create-update-8lng5" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.734106 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-cf87-account-create-update-8lng5"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.736687 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.746897 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b62a-account-create-update-nv688" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.761962 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-dn8jg" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.820955 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkzzt\" (UniqueName: \"kubernetes.io/projected/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-kube-api-access-bkzzt\") pod \"neutron-db-create-25bph\" (UID: \"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8\") " pod="openstack/neutron-db-create-25bph" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.821098 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5367e8d-0a58-4d48-aada-d676ca7f78a0-operator-scripts\") pod \"cinder-cf87-account-create-update-8lng5\" (UID: \"e5367e8d-0a58-4d48-aada-d676ca7f78a0\") " pod="openstack/cinder-cf87-account-create-update-8lng5" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.821180 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-operator-scripts\") pod \"neutron-db-create-25bph\" (UID: \"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8\") " pod="openstack/neutron-db-create-25bph" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.821273 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jmh7\" (UniqueName: \"kubernetes.io/projected/e5367e8d-0a58-4d48-aada-d676ca7f78a0-kube-api-access-2jmh7\") pod \"cinder-cf87-account-create-update-8lng5\" (UID: \"e5367e8d-0a58-4d48-aada-d676ca7f78a0\") " pod="openstack/cinder-cf87-account-create-update-8lng5" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.822014 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-operator-scripts\") pod 
\"neutron-db-create-25bph\" (UID: \"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8\") " pod="openstack/neutron-db-create-25bph" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.883451 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkzzt\" (UniqueName: \"kubernetes.io/projected/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-kube-api-access-bkzzt\") pod \"neutron-db-create-25bph\" (UID: \"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8\") " pod="openstack/neutron-db-create-25bph" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.918395 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b74a-account-create-update-tcwmk"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.920560 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b74a-account-create-update-tcwmk" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.923194 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5367e8d-0a58-4d48-aada-d676ca7f78a0-operator-scripts\") pod \"cinder-cf87-account-create-update-8lng5\" (UID: \"e5367e8d-0a58-4d48-aada-d676ca7f78a0\") " pod="openstack/cinder-cf87-account-create-update-8lng5" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.923407 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jmh7\" (UniqueName: \"kubernetes.io/projected/e5367e8d-0a58-4d48-aada-d676ca7f78a0-kube-api-access-2jmh7\") pod \"cinder-cf87-account-create-update-8lng5\" (UID: \"e5367e8d-0a58-4d48-aada-d676ca7f78a0\") " pod="openstack/cinder-cf87-account-create-update-8lng5" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.924563 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.925797 4760 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5367e8d-0a58-4d48-aada-d676ca7f78a0-operator-scripts\") pod \"cinder-cf87-account-create-update-8lng5\" (UID: \"e5367e8d-0a58-4d48-aada-d676ca7f78a0\") " pod="openstack/cinder-cf87-account-create-update-8lng5" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.933534 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b74a-account-create-update-tcwmk"] Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.950813 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jmh7\" (UniqueName: \"kubernetes.io/projected/e5367e8d-0a58-4d48-aada-d676ca7f78a0-kube-api-access-2jmh7\") pod \"cinder-cf87-account-create-update-8lng5\" (UID: \"e5367e8d-0a58-4d48-aada-d676ca7f78a0\") " pod="openstack/cinder-cf87-account-create-update-8lng5" Dec 04 12:35:38 crc kubenswrapper[4760]: I1204 12:35:38.953507 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.026711 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-operator-scripts\") pod \"neutron-b74a-account-create-update-tcwmk\" (UID: \"ecaeba91-3e2d-404d-97fc-d43be6e0ac06\") " pod="openstack/neutron-b74a-account-create-update-tcwmk" Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.026800 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4ffw\" (UniqueName: \"kubernetes.io/projected/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-kube-api-access-g4ffw\") pod \"neutron-b74a-account-create-update-tcwmk\" (UID: \"ecaeba91-3e2d-404d-97fc-d43be6e0ac06\") " pod="openstack/neutron-b74a-account-create-update-tcwmk" Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 
12:35:39.063852 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-25bph" Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.080073 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-cf87-account-create-update-8lng5" Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.087948 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-f9wkq"] Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.088343 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-f9wkq" podUID="f65c3674-23c1-453d-8439-079822a3eb3c" containerName="dnsmasq-dns" containerID="cri-o://818e5deaa954e765ce9d8c324af281c2674f1d35350b23f6c19209721e99294e" gracePeriod=10 Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.129383 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-operator-scripts\") pod \"neutron-b74a-account-create-update-tcwmk\" (UID: \"ecaeba91-3e2d-404d-97fc-d43be6e0ac06\") " pod="openstack/neutron-b74a-account-create-update-tcwmk" Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.130066 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4ffw\" (UniqueName: \"kubernetes.io/projected/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-kube-api-access-g4ffw\") pod \"neutron-b74a-account-create-update-tcwmk\" (UID: \"ecaeba91-3e2d-404d-97fc-d43be6e0ac06\") " pod="openstack/neutron-b74a-account-create-update-tcwmk" Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.130149 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-operator-scripts\") pod \"neutron-b74a-account-create-update-tcwmk\" 
(UID: \"ecaeba91-3e2d-404d-97fc-d43be6e0ac06\") " pod="openstack/neutron-b74a-account-create-update-tcwmk" Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.162840 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4ffw\" (UniqueName: \"kubernetes.io/projected/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-kube-api-access-g4ffw\") pod \"neutron-b74a-account-create-update-tcwmk\" (UID: \"ecaeba91-3e2d-404d-97fc-d43be6e0ac06\") " pod="openstack/neutron-b74a-account-create-update-tcwmk" Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.269652 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b74a-account-create-update-tcwmk" Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.354731 4760 generic.go:334] "Generic (PLEG): container finished" podID="f65c3674-23c1-453d-8439-079822a3eb3c" containerID="818e5deaa954e765ce9d8c324af281c2674f1d35350b23f6c19209721e99294e" exitCode=0 Dec 04 12:35:39 crc kubenswrapper[4760]: I1204 12:35:39.354803 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-f9wkq" event={"ID":"f65c3674-23c1-453d-8439-079822a3eb3c","Type":"ContainerDied","Data":"818e5deaa954e765ce9d8c324af281c2674f1d35350b23f6c19209721e99294e"} Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.106753 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-xc9fh"] Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.796190 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.813530 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b62a-account-create-update-nv688"] Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.818085 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-f9wkq" event={"ID":"f65c3674-23c1-453d-8439-079822a3eb3c","Type":"ContainerDied","Data":"df49979df03411a4afc00f507c9afa7b31708ce8198484f890a2b28276bc87cf"} Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.818161 4760 scope.go:117] "RemoveContainer" containerID="818e5deaa954e765ce9d8c324af281c2674f1d35350b23f6c19209721e99294e" Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.818359 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-f9wkq" Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.821478 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xc9fh" event={"ID":"34e68130-6cfc-4349-9ca6-1eaa2690e632","Type":"ContainerStarted","Data":"c56ec5a142151c1a5ed18dc16810923a773ffb67a562af98b09877c6015b93a2"} Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.826997 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wc247"] Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.877275 4760 scope.go:117] "RemoveContainer" containerID="417e2bf2d1e0adc914a049497ab436b90e940a51b5a4ab6b6cea7f6b9a516e6d" Dec 04 12:35:40 crc kubenswrapper[4760]: W1204 12:35:40.890623 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3e75bad_e5f1_4db7_abd4_b64a956a01bd.slice/crio-70c1a63970f1d4df6e931550296a4730a47ae6ba218a5226f2ab58f89bd5b62d WatchSource:0}: Error finding container 70c1a63970f1d4df6e931550296a4730a47ae6ba218a5226f2ab58f89bd5b62d: Status 
404 returned error can't find the container with id 70c1a63970f1d4df6e931550296a4730a47ae6ba218a5226f2ab58f89bd5b62d Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.911890 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wwp5d"] Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.955733 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-nb\") pod \"f65c3674-23c1-453d-8439-079822a3eb3c\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.955931 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxwsg\" (UniqueName: \"kubernetes.io/projected/f65c3674-23c1-453d-8439-079822a3eb3c-kube-api-access-bxwsg\") pod \"f65c3674-23c1-453d-8439-079822a3eb3c\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.956003 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-config\") pod \"f65c3674-23c1-453d-8439-079822a3eb3c\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.956051 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-dns-svc\") pod \"f65c3674-23c1-453d-8439-079822a3eb3c\" (UID: \"f65c3674-23c1-453d-8439-079822a3eb3c\") " Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.956159 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-sb\") pod \"f65c3674-23c1-453d-8439-079822a3eb3c\" (UID: 
\"f65c3674-23c1-453d-8439-079822a3eb3c\") " Dec 04 12:35:40 crc kubenswrapper[4760]: I1204 12:35:40.968225 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65c3674-23c1-453d-8439-079822a3eb3c-kube-api-access-bxwsg" (OuterVolumeSpecName: "kube-api-access-bxwsg") pod "f65c3674-23c1-453d-8439-079822a3eb3c" (UID: "f65c3674-23c1-453d-8439-079822a3eb3c"). InnerVolumeSpecName "kube-api-access-bxwsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.059116 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxwsg\" (UniqueName: \"kubernetes.io/projected/f65c3674-23c1-453d-8439-079822a3eb3c-kube-api-access-bxwsg\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.070588 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f65c3674-23c1-453d-8439-079822a3eb3c" (UID: "f65c3674-23c1-453d-8439-079822a3eb3c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.088391 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f65c3674-23c1-453d-8439-079822a3eb3c" (UID: "f65c3674-23c1-453d-8439-079822a3eb3c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.090547 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-25bph"] Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.102928 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-705f-account-create-update-8dpgv"] Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.109812 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-config" (OuterVolumeSpecName: "config") pod "f65c3674-23c1-453d-8439-079822a3eb3c" (UID: "f65c3674-23c1-453d-8439-079822a3eb3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.113509 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f65c3674-23c1-453d-8439-079822a3eb3c" (UID: "f65c3674-23c1-453d-8439-079822a3eb3c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.129608 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-dn8jg"] Dec 04 12:35:41 crc kubenswrapper[4760]: W1204 12:35:41.164888 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0642bc5b_899a_4334_80ff_4eac919be523.slice/crio-660828eb5b25b349ec0f6c8e07b862bdbc9029230dd33e4ef58243c787f32352 WatchSource:0}: Error finding container 660828eb5b25b349ec0f6c8e07b862bdbc9029230dd33e4ef58243c787f32352: Status 404 returned error can't find the container with id 660828eb5b25b349ec0f6c8e07b862bdbc9029230dd33e4ef58243c787f32352 Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.167903 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.167947 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.167961 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.167971 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65c3674-23c1-453d-8439-079822a3eb3c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.211745 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-f9wkq"] Dec 04 12:35:41 crc 
kubenswrapper[4760]: I1204 12:35:41.220703 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-f9wkq"] Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.297809 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-cf87-account-create-update-8lng5"] Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.308393 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b74a-account-create-update-tcwmk"] Dec 04 12:35:41 crc kubenswrapper[4760]: W1204 12:35:41.321497 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecaeba91_3e2d_404d_97fc_d43be6e0ac06.slice/crio-51160173d2807ed79814483fb9efaebdaeec53d023f10e1c7dbf39edac9e6575 WatchSource:0}: Error finding container 51160173d2807ed79814483fb9efaebdaeec53d023f10e1c7dbf39edac9e6575: Status 404 returned error can't find the container with id 51160173d2807ed79814483fb9efaebdaeec53d023f10e1c7dbf39edac9e6575 Dec 04 12:35:41 crc kubenswrapper[4760]: E1204 12:35:41.693296 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce3174d_015c_4a85_b58d_af7603479902.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34e68130_6cfc_4349_9ca6_1eaa2690e632.slice/crio-06922d91c6e7d0cddbbd78fa5d91ff6b27d98a19fa8acf6ae0e67d00590d1eed.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34e68130_6cfc_4349_9ca6_1eaa2690e632.slice/crio-conmon-06922d91c6e7d0cddbbd78fa5d91ff6b27d98a19fa8acf6ae0e67d00590d1eed.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2bfa70a_49e1_4083_80fd_8e32e354de04.slice/crio-conmon-3ec7550ef786a66bca683725bebc9dc87d1faceb96ba0c147f4d36051b39dd27.scope\": RecentStats: unable to find data in memory cache]" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.856153 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cf87-account-create-update-8lng5" event={"ID":"e5367e8d-0a58-4d48-aada-d676ca7f78a0","Type":"ContainerStarted","Data":"8eb75542bba557f19249684fb93dcca1b02bf49a5419db18cbf19c08cdb43f48"} Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.856456 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cf87-account-create-update-8lng5" event={"ID":"e5367e8d-0a58-4d48-aada-d676ca7f78a0","Type":"ContainerStarted","Data":"692a89400c4ef5004db86509c74795ffaa796b6566a12c44271d05a80e519c5b"} Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.877689 4760 generic.go:334] "Generic (PLEG): container finished" podID="f2bfa70a-49e1-4083-80fd-8e32e354de04" containerID="3ec7550ef786a66bca683725bebc9dc87d1faceb96ba0c147f4d36051b39dd27" exitCode=0 Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.899836 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-cf87-account-create-update-8lng5" podStartSLOduration=3.89978985 podStartE2EDuration="3.89978985s" podCreationTimestamp="2025-12-04 12:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:35:41.892830299 +0000 UTC m=+1344.934276866" watchObservedRunningTime="2025-12-04 12:35:41.89978985 +0000 UTC m=+1344.941236417" Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.926700 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65c3674-23c1-453d-8439-079822a3eb3c" path="/var/lib/kubelet/pods/f65c3674-23c1-453d-8439-079822a3eb3c/volumes" Dec 04 12:35:41 
crc kubenswrapper[4760]: I1204 12:35:41.927769 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wc247" event={"ID":"f2bfa70a-49e1-4083-80fd-8e32e354de04","Type":"ContainerDied","Data":"3ec7550ef786a66bca683725bebc9dc87d1faceb96ba0c147f4d36051b39dd27"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.927808 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wc247" event={"ID":"f2bfa70a-49e1-4083-80fd-8e32e354de04","Type":"ContainerStarted","Data":"f1fa740aec3c7bb64f688fada4cad279a4b4bed0ae251d19d10e90017f14dfde"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.927824 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b62a-account-create-update-nv688" event={"ID":"6fcc928e-076c-4853-9d29-56522dc04fd8","Type":"ContainerStarted","Data":"da71358e64518aa19574b796d779780639ef71e800928d2177f0009f8213df53"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.927839 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b62a-account-create-update-nv688" event={"ID":"6fcc928e-076c-4853-9d29-56522dc04fd8","Type":"ContainerStarted","Data":"1c70262a2e0a0add876a4a7a95f05e7a666dca7f3e33ce2df45f610a8bf1651b"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.930025 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wwp5d" event={"ID":"e3e75bad-e5f1-4db7-abd4-b64a956a01bd","Type":"ContainerStarted","Data":"70c1a63970f1d4df6e931550296a4730a47ae6ba218a5226f2ab58f89bd5b62d"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.939260 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b74a-account-create-update-tcwmk" event={"ID":"ecaeba91-3e2d-404d-97fc-d43be6e0ac06","Type":"ContainerStarted","Data":"0d7d4acbb5200f5a67c6200cfe1f9c8f5d38a31052d0b86a271426134df2d2d7"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.939379 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b74a-account-create-update-tcwmk" event={"ID":"ecaeba91-3e2d-404d-97fc-d43be6e0ac06","Type":"ContainerStarted","Data":"51160173d2807ed79814483fb9efaebdaeec53d023f10e1c7dbf39edac9e6575"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.960966 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-705f-account-create-update-8dpgv" event={"ID":"958c379a-8eea-4bb9-8e49-57a92168cf30","Type":"ContainerStarted","Data":"55c9471a30cf978356ac54f66c947634b70256a14025c0a727307ad3c0cb08e8"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.961107 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-705f-account-create-update-8dpgv" event={"ID":"958c379a-8eea-4bb9-8e49-57a92168cf30","Type":"ContainerStarted","Data":"d0f87e51904cc585ea89d65e46c12787c6ec35a99c3b7a7515041bd8dd5ed6d2"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.964906 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-dn8jg" event={"ID":"0642bc5b-899a-4334-80ff-4eac919be523","Type":"ContainerStarted","Data":"8e2f1dc178baf56336675d0b42ac189f65f4b7541ca3c91397618e7f30d88e17"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.964965 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-dn8jg" event={"ID":"0642bc5b-899a-4334-80ff-4eac919be523","Type":"ContainerStarted","Data":"660828eb5b25b349ec0f6c8e07b862bdbc9029230dd33e4ef58243c787f32352"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.968177 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-25bph" event={"ID":"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8","Type":"ContainerStarted","Data":"44370480737c59d5703c20d4787a738c2ec45845317fde8f91e5d2a5f0c8796a"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.968263 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-25bph" event={"ID":"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8","Type":"ContainerStarted","Data":"34061939f95a2ef1b9914190b49178c182179ba7005e6b37242f8004e4afc478"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.968720 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-b62a-account-create-update-nv688" podStartSLOduration=4.968685557 podStartE2EDuration="4.968685557s" podCreationTimestamp="2025-12-04 12:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:35:41.950952054 +0000 UTC m=+1344.992398631" watchObservedRunningTime="2025-12-04 12:35:41.968685557 +0000 UTC m=+1345.010132134"
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.977744 4760 generic.go:334] "Generic (PLEG): container finished" podID="34e68130-6cfc-4349-9ca6-1eaa2690e632" containerID="06922d91c6e7d0cddbbd78fa5d91ff6b27d98a19fa8acf6ae0e67d00590d1eed" exitCode=0
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.978201 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xc9fh" event={"ID":"34e68130-6cfc-4349-9ca6-1eaa2690e632","Type":"ContainerDied","Data":"06922d91c6e7d0cddbbd78fa5d91ff6b27d98a19fa8acf6ae0e67d00590d1eed"}
Dec 04 12:35:41 crc kubenswrapper[4760]: I1204 12:35:41.996831 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b74a-account-create-update-tcwmk" podStartSLOduration=3.99679801 podStartE2EDuration="3.99679801s" podCreationTimestamp="2025-12-04 12:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:35:41.977835677 +0000 UTC m=+1345.019282234" watchObservedRunningTime="2025-12-04 12:35:41.99679801 +0000 UTC m=+1345.038244587"
Dec 04 12:35:42 crc kubenswrapper[4760]: I1204 12:35:42.022154 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-dn8jg" podStartSLOduration=4.022120103 podStartE2EDuration="4.022120103s" podCreationTimestamp="2025-12-04 12:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:35:42.008916234 +0000 UTC m=+1345.050362801" watchObservedRunningTime="2025-12-04 12:35:42.022120103 +0000 UTC m=+1345.063566670"
Dec 04 12:35:42 crc kubenswrapper[4760]: I1204 12:35:42.069789 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-705f-account-create-update-8dpgv" podStartSLOduration=4.069759926 podStartE2EDuration="4.069759926s" podCreationTimestamp="2025-12-04 12:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:35:42.062478304 +0000 UTC m=+1345.103924871" watchObservedRunningTime="2025-12-04 12:35:42.069759926 +0000 UTC m=+1345.111206493"
Dec 04 12:35:42 crc kubenswrapper[4760]: I1204 12:35:42.998463 4760 generic.go:334] "Generic (PLEG): container finished" podID="cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8" containerID="44370480737c59d5703c20d4787a738c2ec45845317fde8f91e5d2a5f0c8796a" exitCode=0
Dec 04 12:35:42 crc kubenswrapper[4760]: I1204 12:35:42.998564 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-25bph" event={"ID":"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8","Type":"ContainerDied","Data":"44370480737c59d5703c20d4787a738c2ec45845317fde8f91e5d2a5f0c8796a"}
Dec 04 12:35:43 crc kubenswrapper[4760]: I1204 12:35:43.008538 4760 generic.go:334] "Generic (PLEG): container finished" podID="e5367e8d-0a58-4d48-aada-d676ca7f78a0" containerID="8eb75542bba557f19249684fb93dcca1b02bf49a5419db18cbf19c08cdb43f48" exitCode=0
Dec 04 12:35:43 crc kubenswrapper[4760]: I1204 12:35:43.008685 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cf87-account-create-update-8lng5" event={"ID":"e5367e8d-0a58-4d48-aada-d676ca7f78a0","Type":"ContainerDied","Data":"8eb75542bba557f19249684fb93dcca1b02bf49a5419db18cbf19c08cdb43f48"}
Dec 04 12:35:43 crc kubenswrapper[4760]: I1204 12:35:43.027923 4760 generic.go:334] "Generic (PLEG): container finished" podID="958c379a-8eea-4bb9-8e49-57a92168cf30" containerID="55c9471a30cf978356ac54f66c947634b70256a14025c0a727307ad3c0cb08e8" exitCode=0
Dec 04 12:35:43 crc kubenswrapper[4760]: I1204 12:35:43.028194 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-705f-account-create-update-8dpgv" event={"ID":"958c379a-8eea-4bb9-8e49-57a92168cf30","Type":"ContainerDied","Data":"55c9471a30cf978356ac54f66c947634b70256a14025c0a727307ad3c0cb08e8"}
Dec 04 12:35:43 crc kubenswrapper[4760]: I1204 12:35:43.037009 4760 generic.go:334] "Generic (PLEG): container finished" podID="6fcc928e-076c-4853-9d29-56522dc04fd8" containerID="da71358e64518aa19574b796d779780639ef71e800928d2177f0009f8213df53" exitCode=0
Dec 04 12:35:43 crc kubenswrapper[4760]: I1204 12:35:43.037174 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b62a-account-create-update-nv688" event={"ID":"6fcc928e-076c-4853-9d29-56522dc04fd8","Type":"ContainerDied","Data":"da71358e64518aa19574b796d779780639ef71e800928d2177f0009f8213df53"}
Dec 04 12:35:43 crc kubenswrapper[4760]: I1204 12:35:43.042526 4760 generic.go:334] "Generic (PLEG): container finished" podID="0642bc5b-899a-4334-80ff-4eac919be523" containerID="8e2f1dc178baf56336675d0b42ac189f65f4b7541ca3c91397618e7f30d88e17" exitCode=0
Dec 04 12:35:43 crc kubenswrapper[4760]: I1204 12:35:43.042722 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-dn8jg" event={"ID":"0642bc5b-899a-4334-80ff-4eac919be523","Type":"ContainerDied","Data":"8e2f1dc178baf56336675d0b42ac189f65f4b7541ca3c91397618e7f30d88e17"}
Dec 04 12:35:43 crc kubenswrapper[4760]: I1204 12:35:43.048351 4760 generic.go:334] "Generic (PLEG): container finished" podID="ecaeba91-3e2d-404d-97fc-d43be6e0ac06" containerID="0d7d4acbb5200f5a67c6200cfe1f9c8f5d38a31052d0b86a271426134df2d2d7" exitCode=0
Dec 04 12:35:43 crc kubenswrapper[4760]: I1204 12:35:43.049341 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b74a-account-create-update-tcwmk" event={"ID":"ecaeba91-3e2d-404d-97fc-d43be6e0ac06","Type":"ContainerDied","Data":"0d7d4acbb5200f5a67c6200cfe1f9c8f5d38a31052d0b86a271426134df2d2d7"}
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.106921 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-25bph"
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.123059 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-25bph" event={"ID":"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8","Type":"ContainerDied","Data":"34061939f95a2ef1b9914190b49178c182179ba7005e6b37242f8004e4afc478"}
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.123111 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34061939f95a2ef1b9914190b49178c182179ba7005e6b37242f8004e4afc478"
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.201075 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkzzt\" (UniqueName: \"kubernetes.io/projected/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-kube-api-access-bkzzt\") pod \"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8\" (UID: \"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8\") "
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.201453 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-operator-scripts\") pod \"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8\" (UID: \"cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8\") "
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.203569 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8" (UID: "cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.308712 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.340189 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-kube-api-access-bkzzt" (OuterVolumeSpecName: "kube-api-access-bkzzt") pod "cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8" (UID: "cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8"). InnerVolumeSpecName "kube-api-access-bkzzt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.342916 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wc247"
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.363189 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xc9fh"
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.410097 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgz9m\" (UniqueName: \"kubernetes.io/projected/f2bfa70a-49e1-4083-80fd-8e32e354de04-kube-api-access-wgz9m\") pod \"f2bfa70a-49e1-4083-80fd-8e32e354de04\" (UID: \"f2bfa70a-49e1-4083-80fd-8e32e354de04\") "
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.410755 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2bfa70a-49e1-4083-80fd-8e32e354de04-operator-scripts\") pod \"f2bfa70a-49e1-4083-80fd-8e32e354de04\" (UID: \"f2bfa70a-49e1-4083-80fd-8e32e354de04\") "
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.410917 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e68130-6cfc-4349-9ca6-1eaa2690e632-operator-scripts\") pod \"34e68130-6cfc-4349-9ca6-1eaa2690e632\" (UID: \"34e68130-6cfc-4349-9ca6-1eaa2690e632\") "
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.411017 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9mqd\" (UniqueName: \"kubernetes.io/projected/34e68130-6cfc-4349-9ca6-1eaa2690e632-kube-api-access-v9mqd\") pod \"34e68130-6cfc-4349-9ca6-1eaa2690e632\" (UID: \"34e68130-6cfc-4349-9ca6-1eaa2690e632\") "
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.411858 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkzzt\" (UniqueName: \"kubernetes.io/projected/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8-kube-api-access-bkzzt\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.412624 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2bfa70a-49e1-4083-80fd-8e32e354de04-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2bfa70a-49e1-4083-80fd-8e32e354de04" (UID: "f2bfa70a-49e1-4083-80fd-8e32e354de04"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.413714 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e68130-6cfc-4349-9ca6-1eaa2690e632-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34e68130-6cfc-4349-9ca6-1eaa2690e632" (UID: "34e68130-6cfc-4349-9ca6-1eaa2690e632"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.421595 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2bfa70a-49e1-4083-80fd-8e32e354de04-kube-api-access-wgz9m" (OuterVolumeSpecName: "kube-api-access-wgz9m") pod "f2bfa70a-49e1-4083-80fd-8e32e354de04" (UID: "f2bfa70a-49e1-4083-80fd-8e32e354de04"). InnerVolumeSpecName "kube-api-access-wgz9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.421843 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e68130-6cfc-4349-9ca6-1eaa2690e632-kube-api-access-v9mqd" (OuterVolumeSpecName: "kube-api-access-v9mqd") pod "34e68130-6cfc-4349-9ca6-1eaa2690e632" (UID: "34e68130-6cfc-4349-9ca6-1eaa2690e632"). InnerVolumeSpecName "kube-api-access-v9mqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.738874 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2bfa70a-49e1-4083-80fd-8e32e354de04-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.738926 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34e68130-6cfc-4349-9ca6-1eaa2690e632-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.738944 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9mqd\" (UniqueName: \"kubernetes.io/projected/34e68130-6cfc-4349-9ca6-1eaa2690e632-kube-api-access-v9mqd\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:44 crc kubenswrapper[4760]: I1204 12:35:44.738961 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgz9m\" (UniqueName: \"kubernetes.io/projected/f2bfa70a-49e1-4083-80fd-8e32e354de04-kube-api-access-wgz9m\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.150880 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wc247"
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.151206 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wc247" event={"ID":"f2bfa70a-49e1-4083-80fd-8e32e354de04","Type":"ContainerDied","Data":"f1fa740aec3c7bb64f688fada4cad279a4b4bed0ae251d19d10e90017f14dfde"}
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.152582 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1fa740aec3c7bb64f688fada4cad279a4b4bed0ae251d19d10e90017f14dfde"
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.159878 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-25bph"
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.170595 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xc9fh"
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.181008 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xc9fh" event={"ID":"34e68130-6cfc-4349-9ca6-1eaa2690e632","Type":"ContainerDied","Data":"c56ec5a142151c1a5ed18dc16810923a773ffb67a562af98b09877c6015b93a2"}
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.181154 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56ec5a142151c1a5ed18dc16810923a773ffb67a562af98b09877c6015b93a2"
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.300559 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-cf87-account-create-update-8lng5"
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.459925 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5367e8d-0a58-4d48-aada-d676ca7f78a0-operator-scripts\") pod \"e5367e8d-0a58-4d48-aada-d676ca7f78a0\" (UID: \"e5367e8d-0a58-4d48-aada-d676ca7f78a0\") "
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.460229 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jmh7\" (UniqueName: \"kubernetes.io/projected/e5367e8d-0a58-4d48-aada-d676ca7f78a0-kube-api-access-2jmh7\") pod \"e5367e8d-0a58-4d48-aada-d676ca7f78a0\" (UID: \"e5367e8d-0a58-4d48-aada-d676ca7f78a0\") "
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.464927 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5367e8d-0a58-4d48-aada-d676ca7f78a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5367e8d-0a58-4d48-aada-d676ca7f78a0" (UID: "e5367e8d-0a58-4d48-aada-d676ca7f78a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.474526 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5367e8d-0a58-4d48-aada-d676ca7f78a0-kube-api-access-2jmh7" (OuterVolumeSpecName: "kube-api-access-2jmh7") pod "e5367e8d-0a58-4d48-aada-d676ca7f78a0" (UID: "e5367e8d-0a58-4d48-aada-d676ca7f78a0"). InnerVolumeSpecName "kube-api-access-2jmh7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.565021 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jmh7\" (UniqueName: \"kubernetes.io/projected/e5367e8d-0a58-4d48-aada-d676ca7f78a0-kube-api-access-2jmh7\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.565627 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5367e8d-0a58-4d48-aada-d676ca7f78a0-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.640421 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-dn8jg"
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.692449 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b74a-account-create-update-tcwmk"
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.705155 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-705f-account-create-update-8dpgv"
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.724749 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b62a-account-create-update-nv688"
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.773430 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2s2c\" (UniqueName: \"kubernetes.io/projected/0642bc5b-899a-4334-80ff-4eac919be523-kube-api-access-k2s2c\") pod \"0642bc5b-899a-4334-80ff-4eac919be523\" (UID: \"0642bc5b-899a-4334-80ff-4eac919be523\") "
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.773668 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0642bc5b-899a-4334-80ff-4eac919be523-operator-scripts\") pod \"0642bc5b-899a-4334-80ff-4eac919be523\" (UID: \"0642bc5b-899a-4334-80ff-4eac919be523\") "
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.782248 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0642bc5b-899a-4334-80ff-4eac919be523-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0642bc5b-899a-4334-80ff-4eac919be523" (UID: "0642bc5b-899a-4334-80ff-4eac919be523"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.783135 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0642bc5b-899a-4334-80ff-4eac919be523-kube-api-access-k2s2c" (OuterVolumeSpecName: "kube-api-access-k2s2c") pod "0642bc5b-899a-4334-80ff-4eac919be523" (UID: "0642bc5b-899a-4334-80ff-4eac919be523"). InnerVolumeSpecName "kube-api-access-k2s2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.876509 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4ffw\" (UniqueName: \"kubernetes.io/projected/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-kube-api-access-g4ffw\") pod \"ecaeba91-3e2d-404d-97fc-d43be6e0ac06\" (UID: \"ecaeba91-3e2d-404d-97fc-d43be6e0ac06\") "
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.876686 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrlw9\" (UniqueName: \"kubernetes.io/projected/958c379a-8eea-4bb9-8e49-57a92168cf30-kube-api-access-qrlw9\") pod \"958c379a-8eea-4bb9-8e49-57a92168cf30\" (UID: \"958c379a-8eea-4bb9-8e49-57a92168cf30\") "
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.876722 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfdcv\" (UniqueName: \"kubernetes.io/projected/6fcc928e-076c-4853-9d29-56522dc04fd8-kube-api-access-zfdcv\") pod \"6fcc928e-076c-4853-9d29-56522dc04fd8\" (UID: \"6fcc928e-076c-4853-9d29-56522dc04fd8\") "
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.876885 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fcc928e-076c-4853-9d29-56522dc04fd8-operator-scripts\") pod \"6fcc928e-076c-4853-9d29-56522dc04fd8\" (UID: \"6fcc928e-076c-4853-9d29-56522dc04fd8\") "
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.877032 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958c379a-8eea-4bb9-8e49-57a92168cf30-operator-scripts\") pod \"958c379a-8eea-4bb9-8e49-57a92168cf30\" (UID: \"958c379a-8eea-4bb9-8e49-57a92168cf30\") "
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.877138 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-operator-scripts\") pod \"ecaeba91-3e2d-404d-97fc-d43be6e0ac06\" (UID: \"ecaeba91-3e2d-404d-97fc-d43be6e0ac06\") "
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.878654 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2s2c\" (UniqueName: \"kubernetes.io/projected/0642bc5b-899a-4334-80ff-4eac919be523-kube-api-access-k2s2c\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.878692 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0642bc5b-899a-4334-80ff-4eac919be523-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.883258 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ecaeba91-3e2d-404d-97fc-d43be6e0ac06" (UID: "ecaeba91-3e2d-404d-97fc-d43be6e0ac06"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.887648 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fcc928e-076c-4853-9d29-56522dc04fd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6fcc928e-076c-4853-9d29-56522dc04fd8" (UID: "6fcc928e-076c-4853-9d29-56522dc04fd8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.888183 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/958c379a-8eea-4bb9-8e49-57a92168cf30-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "958c379a-8eea-4bb9-8e49-57a92168cf30" (UID: "958c379a-8eea-4bb9-8e49-57a92168cf30"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.892334 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-kube-api-access-g4ffw" (OuterVolumeSpecName: "kube-api-access-g4ffw") pod "ecaeba91-3e2d-404d-97fc-d43be6e0ac06" (UID: "ecaeba91-3e2d-404d-97fc-d43be6e0ac06"). InnerVolumeSpecName "kube-api-access-g4ffw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.910771 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/958c379a-8eea-4bb9-8e49-57a92168cf30-kube-api-access-qrlw9" (OuterVolumeSpecName: "kube-api-access-qrlw9") pod "958c379a-8eea-4bb9-8e49-57a92168cf30" (UID: "958c379a-8eea-4bb9-8e49-57a92168cf30"). InnerVolumeSpecName "kube-api-access-qrlw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.917619 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fcc928e-076c-4853-9d29-56522dc04fd8-kube-api-access-zfdcv" (OuterVolumeSpecName: "kube-api-access-zfdcv") pod "6fcc928e-076c-4853-9d29-56522dc04fd8" (UID: "6fcc928e-076c-4853-9d29-56522dc04fd8"). InnerVolumeSpecName "kube-api-access-zfdcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.998313 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/958c379a-8eea-4bb9-8e49-57a92168cf30-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.998382 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.998396 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4ffw\" (UniqueName: \"kubernetes.io/projected/ecaeba91-3e2d-404d-97fc-d43be6e0ac06-kube-api-access-g4ffw\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.998413 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrlw9\" (UniqueName: \"kubernetes.io/projected/958c379a-8eea-4bb9-8e49-57a92168cf30-kube-api-access-qrlw9\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.998426 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfdcv\" (UniqueName: \"kubernetes.io/projected/6fcc928e-076c-4853-9d29-56522dc04fd8-kube-api-access-zfdcv\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:45 crc kubenswrapper[4760]: I1204 12:35:45.998438 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fcc928e-076c-4853-9d29-56522dc04fd8-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.201395 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-dn8jg" event={"ID":"0642bc5b-899a-4334-80ff-4eac919be523","Type":"ContainerDied","Data":"660828eb5b25b349ec0f6c8e07b862bdbc9029230dd33e4ef58243c787f32352"}
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.201458 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-dn8jg"
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.201482 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="660828eb5b25b349ec0f6c8e07b862bdbc9029230dd33e4ef58243c787f32352"
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.205730 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b74a-account-create-update-tcwmk"
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.205751 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b74a-account-create-update-tcwmk" event={"ID":"ecaeba91-3e2d-404d-97fc-d43be6e0ac06","Type":"ContainerDied","Data":"51160173d2807ed79814483fb9efaebdaeec53d023f10e1c7dbf39edac9e6575"}
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.205798 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51160173d2807ed79814483fb9efaebdaeec53d023f10e1c7dbf39edac9e6575"
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.214830 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cf87-account-create-update-8lng5" event={"ID":"e5367e8d-0a58-4d48-aada-d676ca7f78a0","Type":"ContainerDied","Data":"692a89400c4ef5004db86509c74795ffaa796b6566a12c44271d05a80e519c5b"}
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.214897 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="692a89400c4ef5004db86509c74795ffaa796b6566a12c44271d05a80e519c5b"
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.215035 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-cf87-account-create-update-8lng5"
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.221760 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-705f-account-create-update-8dpgv"
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.221759 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-705f-account-create-update-8dpgv" event={"ID":"958c379a-8eea-4bb9-8e49-57a92168cf30","Type":"ContainerDied","Data":"d0f87e51904cc585ea89d65e46c12787c6ec35a99c3b7a7515041bd8dd5ed6d2"}
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.221909 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0f87e51904cc585ea89d65e46c12787c6ec35a99c3b7a7515041bd8dd5ed6d2"
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.227730 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b62a-account-create-update-nv688" event={"ID":"6fcc928e-076c-4853-9d29-56522dc04fd8","Type":"ContainerDied","Data":"1c70262a2e0a0add876a4a7a95f05e7a666dca7f3e33ce2df45f610a8bf1651b"}
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.227783 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c70262a2e0a0add876a4a7a95f05e7a666dca7f3e33ce2df45f610a8bf1651b"
Dec 04 12:35:46 crc kubenswrapper[4760]: I1204 12:35:46.227935 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b62a-account-create-update-nv688"
Dec 04 12:35:47 crc kubenswrapper[4760]: I1204 12:35:47.319987 4760 generic.go:334] "Generic (PLEG): container finished" podID="2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" containerID="0f91f22ee3cee8f1e541180a1829532b03212e89dddd8bb0c41db66a172d6964" exitCode=0
Dec 04 12:35:47 crc kubenswrapper[4760]: I1204 12:35:47.320491 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vmvd8" event={"ID":"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74","Type":"ContainerDied","Data":"0f91f22ee3cee8f1e541180a1829532b03212e89dddd8bb0c41db66a172d6964"}
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.230109 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vmvd8"
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.388104 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vmvd8" event={"ID":"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74","Type":"ContainerDied","Data":"dcfacd09363d863e4bc606f4e14e070b19677bd09aace4213f26ee90b17f4af8"}
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.388175 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcfacd09363d863e4bc606f4e14e070b19677bd09aace4213f26ee90b17f4af8"
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.388180 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vmvd8"
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.393140 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-combined-ca-bundle\") pod \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") "
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.393585 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2wtz\" (UniqueName: \"kubernetes.io/projected/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-kube-api-access-m2wtz\") pod \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") "
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.393689 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-config-data\") pod \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") "
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.393851 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-db-sync-config-data\") pod \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\" (UID: \"2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74\") "
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.404532 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-kube-api-access-m2wtz" (OuterVolumeSpecName: "kube-api-access-m2wtz") pod "2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" (UID: "2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74"). InnerVolumeSpecName "kube-api-access-m2wtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.408614 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" (UID: "2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.429888 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" (UID: "2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.458325 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-config-data" (OuterVolumeSpecName: "config-data") pod "2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" (UID: "2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.498607 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2wtz\" (UniqueName: \"kubernetes.io/projected/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-kube-api-access-m2wtz\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.498656 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.498691 4760 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:51 crc kubenswrapper[4760]: I1204 12:35:51.498701 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:51 crc kubenswrapper[4760]: E1204 12:35:51.971628 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce3174d_015c_4a85_b58d_af7603479902.slice\": RecentStats: unable to find data in memory cache]" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.562811 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wwp5d" event={"ID":"e3e75bad-e5f1-4db7-abd4-b64a956a01bd","Type":"ContainerStarted","Data":"d252a93ff61731da07bb6354c4d3cf65978377efb0f7b3926d80b2e6af41de99"} Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.601001 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wwp5d" podStartSLOduration=4.095185133 
podStartE2EDuration="14.600979628s" podCreationTimestamp="2025-12-04 12:35:38 +0000 UTC" firstStartedPulling="2025-12-04 12:35:40.898374962 +0000 UTC m=+1343.939821529" lastFinishedPulling="2025-12-04 12:35:51.404169467 +0000 UTC m=+1354.445616024" observedRunningTime="2025-12-04 12:35:52.592819959 +0000 UTC m=+1355.634266526" watchObservedRunningTime="2025-12-04 12:35:52.600979628 +0000 UTC m=+1355.642426195" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.934936 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-crdnj"] Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935629 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958c379a-8eea-4bb9-8e49-57a92168cf30" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935655 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="958c379a-8eea-4bb9-8e49-57a92168cf30" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935675 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8" containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935684 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8" containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935716 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" containerName="glance-db-sync" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935727 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" containerName="glance-db-sync" Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935736 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bfa70a-49e1-4083-80fd-8e32e354de04" 
containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935743 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bfa70a-49e1-4083-80fd-8e32e354de04" containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935761 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0642bc5b-899a-4334-80ff-4eac919be523" containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935772 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0642bc5b-899a-4334-80ff-4eac919be523" containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935799 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fcc928e-076c-4853-9d29-56522dc04fd8" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935809 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fcc928e-076c-4853-9d29-56522dc04fd8" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935830 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5367e8d-0a58-4d48-aada-d676ca7f78a0" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935839 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5367e8d-0a58-4d48-aada-d676ca7f78a0" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935857 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e68130-6cfc-4349-9ca6-1eaa2690e632" containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935864 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e68130-6cfc-4349-9ca6-1eaa2690e632" containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935879 4760 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65c3674-23c1-453d-8439-079822a3eb3c" containerName="dnsmasq-dns" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935887 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65c3674-23c1-453d-8439-079822a3eb3c" containerName="dnsmasq-dns" Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935898 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65c3674-23c1-453d-8439-079822a3eb3c" containerName="init" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935906 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65c3674-23c1-453d-8439-079822a3eb3c" containerName="init" Dec 04 12:35:52 crc kubenswrapper[4760]: E1204 12:35:52.935923 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecaeba91-3e2d-404d-97fc-d43be6e0ac06" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.935932 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecaeba91-3e2d-404d-97fc-d43be6e0ac06" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.936154 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="958c379a-8eea-4bb9-8e49-57a92168cf30" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.936188 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecaeba91-3e2d-404d-97fc-d43be6e0ac06" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.936202 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fcc928e-076c-4853-9d29-56522dc04fd8" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.936237 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e68130-6cfc-4349-9ca6-1eaa2690e632" containerName="mariadb-database-create" Dec 04 12:35:52 
crc kubenswrapper[4760]: I1204 12:35:52.936252 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0642bc5b-899a-4334-80ff-4eac919be523" containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.936267 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2bfa70a-49e1-4083-80fd-8e32e354de04" containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.936279 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5367e8d-0a58-4d48-aada-d676ca7f78a0" containerName="mariadb-account-create-update" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.936293 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" containerName="glance-db-sync" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.936302 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8" containerName="mariadb-database-create" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.936317 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f65c3674-23c1-453d-8439-079822a3eb3c" containerName="dnsmasq-dns" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.938882 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.945435 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.945515 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.945544 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.945743 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.946118 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-config\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" 
(UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.946165 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvsd6\" (UniqueName: \"kubernetes.io/projected/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-kube-api-access-wvsd6\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:52 crc kubenswrapper[4760]: I1204 12:35:52.950548 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-crdnj"] Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.048930 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-config\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.049012 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvsd6\" (UniqueName: \"kubernetes.io/projected/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-kube-api-access-wvsd6\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.049111 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.049235 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.049283 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.049381 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.050817 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.051652 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.051708 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-config\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.051964 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.052674 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.082590 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvsd6\" (UniqueName: \"kubernetes.io/projected/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-kube-api-access-wvsd6\") pod \"dnsmasq-dns-74f6bcbc87-crdnj\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:53 crc kubenswrapper[4760]: I1204 12:35:53.265401 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:54 crc kubenswrapper[4760]: I1204 12:35:54.063101 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-crdnj"] Dec 04 12:35:54 crc kubenswrapper[4760]: I1204 12:35:54.694137 4760 generic.go:334] "Generic (PLEG): container finished" podID="c0bd9d36-6fb6-40f7-95f3-1393b8db1261" containerID="1302d837b1971ee9c942f1a375f4696dec82f5a113d6f7864d8e42830b30c5cc" exitCode=0 Dec 04 12:35:54 crc kubenswrapper[4760]: I1204 12:35:54.694261 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" event={"ID":"c0bd9d36-6fb6-40f7-95f3-1393b8db1261","Type":"ContainerDied","Data":"1302d837b1971ee9c942f1a375f4696dec82f5a113d6f7864d8e42830b30c5cc"} Dec 04 12:35:54 crc kubenswrapper[4760]: I1204 12:35:54.694822 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" event={"ID":"c0bd9d36-6fb6-40f7-95f3-1393b8db1261","Type":"ContainerStarted","Data":"bd12605fac177480bbdb857154cb999b625f703e8f75336248aad0be81e13ffd"} Dec 04 12:35:55 crc kubenswrapper[4760]: I1204 12:35:55.711681 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" event={"ID":"c0bd9d36-6fb6-40f7-95f3-1393b8db1261","Type":"ContainerStarted","Data":"8a08d148baa0632676b2afc355f7c7a9b8adbaa9c02a366a0b15e7a0488d6b85"} Dec 04 12:35:55 crc kubenswrapper[4760]: I1204 12:35:55.712373 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:35:55 crc kubenswrapper[4760]: I1204 12:35:55.749118 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" podStartSLOduration=3.749090269 podStartE2EDuration="3.749090269s" podCreationTimestamp="2025-12-04 12:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:35:55.740489176 +0000 UTC m=+1358.781935743" watchObservedRunningTime="2025-12-04 12:35:55.749090269 +0000 UTC m=+1358.790536826" Dec 04 12:35:57 crc kubenswrapper[4760]: I1204 12:35:57.739820 4760 generic.go:334] "Generic (PLEG): container finished" podID="e3e75bad-e5f1-4db7-abd4-b64a956a01bd" containerID="d252a93ff61731da07bb6354c4d3cf65978377efb0f7b3926d80b2e6af41de99" exitCode=0 Dec 04 12:35:57 crc kubenswrapper[4760]: I1204 12:35:57.739903 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wwp5d" event={"ID":"e3e75bad-e5f1-4db7-abd4-b64a956a01bd","Type":"ContainerDied","Data":"d252a93ff61731da07bb6354c4d3cf65978377efb0f7b3926d80b2e6af41de99"} Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.150411 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.330546 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-combined-ca-bundle\") pod \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.331985 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-config-data\") pod \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\" (UID: \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.332101 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlnjc\" (UniqueName: \"kubernetes.io/projected/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-kube-api-access-vlnjc\") pod \"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\" (UID: 
\"e3e75bad-e5f1-4db7-abd4-b64a956a01bd\") " Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.352288 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-kube-api-access-vlnjc" (OuterVolumeSpecName: "kube-api-access-vlnjc") pod "e3e75bad-e5f1-4db7-abd4-b64a956a01bd" (UID: "e3e75bad-e5f1-4db7-abd4-b64a956a01bd"). InnerVolumeSpecName "kube-api-access-vlnjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.379015 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3e75bad-e5f1-4db7-abd4-b64a956a01bd" (UID: "e3e75bad-e5f1-4db7-abd4-b64a956a01bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.420555 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-config-data" (OuterVolumeSpecName: "config-data") pod "e3e75bad-e5f1-4db7-abd4-b64a956a01bd" (UID: "e3e75bad-e5f1-4db7-abd4-b64a956a01bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.433782 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.433825 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.433842 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlnjc\" (UniqueName: \"kubernetes.io/projected/e3e75bad-e5f1-4db7-abd4-b64a956a01bd-kube-api-access-vlnjc\") on node \"crc\" DevicePath \"\"" Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.767282 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wwp5d" event={"ID":"e3e75bad-e5f1-4db7-abd4-b64a956a01bd","Type":"ContainerDied","Data":"70c1a63970f1d4df6e931550296a4730a47ae6ba218a5226f2ab58f89bd5b62d"} Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.767856 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70c1a63970f1d4df6e931550296a4730a47ae6ba218a5226f2ab58f89bd5b62d" Dec 04 12:35:59 crc kubenswrapper[4760]: I1204 12:35:59.768078 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wwp5d" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.579681 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-57zpb"] Dec 04 12:36:00 crc kubenswrapper[4760]: E1204 12:36:00.580385 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e75bad-e5f1-4db7-abd4-b64a956a01bd" containerName="keystone-db-sync" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.580409 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e75bad-e5f1-4db7-abd4-b64a956a01bd" containerName="keystone-db-sync" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.580727 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e75bad-e5f1-4db7-abd4-b64a956a01bd" containerName="keystone-db-sync" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.582517 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.587009 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.587050 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnvh9" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.587418 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.587548 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.589756 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-credential-keys\") pod \"keystone-bootstrap-57zpb\" (UID: 
\"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.590034 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-scripts\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.590201 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-fernet-keys\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.590382 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-combined-ca-bundle\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.590701 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x4zw\" (UniqueName: \"kubernetes.io/projected/2adf9c9c-a451-4484-8630-d28b66e8e567-kube-api-access-6x4zw\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.590843 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-config-data\") pod \"keystone-bootstrap-57zpb\" (UID: 
\"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.596842 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.607896 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-57zpb"] Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.691796 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-credential-keys\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.691945 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-scripts\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.692008 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-fernet-keys\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.692065 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-combined-ca-bundle\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.692159 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6x4zw\" (UniqueName: \"kubernetes.io/projected/2adf9c9c-a451-4484-8630-d28b66e8e567-kube-api-access-6x4zw\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:00 crc kubenswrapper[4760]: I1204 12:36:00.692233 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-config-data\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:00.714092 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-fernet-keys\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:00.989334 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-combined-ca-bundle\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.089347 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-credential-keys\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.115034 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-scripts\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.126653 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-config-data\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.152130 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x4zw\" (UniqueName: \"kubernetes.io/projected/2adf9c9c-a451-4484-8630-d28b66e8e567-kube-api-access-6x4zw\") pod \"keystone-bootstrap-57zpb\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.176720 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-46wx2"] Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.178897 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.216792 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-g6rbc" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.217166 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.217581 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.232500 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.232836 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8179a26-2281-4a5d-bc77-808a2f7e61bb-etc-machine-id\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.232894 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-db-sync-config-data\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.232960 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-config-data\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.233139 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8gh5\" (UniqueName: \"kubernetes.io/projected/a8179a26-2281-4a5d-bc77-808a2f7e61bb-kube-api-access-z8gh5\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.233238 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-scripts\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " 
pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.233350 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-combined-ca-bundle\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.300247 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-46wx2"] Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.339725 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84489dfbd7-57pm6"] Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.346096 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8179a26-2281-4a5d-bc77-808a2f7e61bb-etc-machine-id\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.346495 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-db-sync-config-data\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.346745 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-config-data\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.347183 4760 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-z8gh5\" (UniqueName: \"kubernetes.io/projected/a8179a26-2281-4a5d-bc77-808a2f7e61bb-kube-api-access-z8gh5\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.347426 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-scripts\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.347635 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-combined-ca-bundle\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.347919 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8179a26-2281-4a5d-bc77-808a2f7e61bb-etc-machine-id\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.358478 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.388118 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-scripts\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.389905 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-db-sync-config-data\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.389971 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-crdnj"] Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.390402 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" podUID="c0bd9d36-6fb6-40f7-95f3-1393b8db1261" containerName="dnsmasq-dns" containerID="cri-o://8a08d148baa0632676b2afc355f7c7a9b8adbaa9c02a366a0b15e7a0488d6b85" gracePeriod=10 Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.391594 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-combined-ca-bundle\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.409224 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-bc6ks" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.411429 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"horizon" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.431768 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.432114 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.433665 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-config-data\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.451330 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.457565 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqgfn\" (UniqueName: \"kubernetes.io/projected/a1e1a276-940e-45e7-b6b3-f9650cbd653c-kube-api-access-fqgfn\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.457739 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1e1a276-940e-45e7-b6b3-f9650cbd653c-logs\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.457964 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-config-data\") pod \"horizon-84489dfbd7-57pm6\" (UID: 
\"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.458029 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-scripts\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.458292 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1e1a276-940e-45e7-b6b3-f9650cbd653c-horizon-secret-key\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.469143 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84489dfbd7-57pm6"] Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.490164 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8gh5\" (UniqueName: \"kubernetes.io/projected/a8179a26-2281-4a5d-bc77-808a2f7e61bb-kube-api-access-z8gh5\") pod \"cinder-db-sync-46wx2\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.579030 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-config-data\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.579145 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-scripts\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.583005 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-46wx2" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.618462 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-config-data\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.636669 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1e1a276-940e-45e7-b6b3-f9650cbd653c-horizon-secret-key\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.636770 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqgfn\" (UniqueName: \"kubernetes.io/projected/a1e1a276-940e-45e7-b6b3-f9650cbd653c-kube-api-access-fqgfn\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.636888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1e1a276-940e-45e7-b6b3-f9650cbd653c-logs\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.637554 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1e1a276-940e-45e7-b6b3-f9650cbd653c-logs\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.683197 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-scripts\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.744922 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-rb22w"] Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.763127 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.765917 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqgfn\" (UniqueName: \"kubernetes.io/projected/a1e1a276-940e-45e7-b6b3-f9650cbd653c-kube-api-access-fqgfn\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.781995 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1e1a276-940e-45e7-b6b3-f9650cbd653c-horizon-secret-key\") pod \"horizon-84489dfbd7-57pm6\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.803277 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-tnswh" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.812426 4760 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"manila-config-data" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.855665 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpr4j\" (UniqueName: \"kubernetes.io/projected/6e2d78cb-0c7a-408f-a736-6630b41bd80b-kube-api-access-fpr4j\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.855825 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-config-data\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.855948 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-combined-ca-bundle\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.856095 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-job-config-data\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.868708 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zjxh7"] Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.872336 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:01 crc kubenswrapper[4760]: I1204 12:36:01.895352 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.224030 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpr4j\" (UniqueName: \"kubernetes.io/projected/6e2d78cb-0c7a-408f-a736-6630b41bd80b-kube-api-access-fpr4j\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.224563 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-config-data\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.224622 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-combined-ca-bundle\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.224706 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-job-config-data\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.239425 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-job-config-data\") 
pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.244322 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-combined-ca-bundle\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.253748 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-config-data\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.302711 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jl9zh"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.304491 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-rb22w"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.304645 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.307664 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zjxh7"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.316326 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.316740 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-q74n6" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.318317 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpr4j\" (UniqueName: \"kubernetes.io/projected/6e2d78cb-0c7a-408f-a736-6630b41bd80b-kube-api-access-fpr4j\") pod \"manila-db-sync-rb22w\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.326053 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.326411 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.327473 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.327641 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: 
\"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.327688 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.327725 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.327782 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-config\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.327812 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vm47\" (UniqueName: \"kubernetes.io/projected/df17b640-201e-457f-baf2-67c1d767c77e-kube-api-access-9vm47\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.344837 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.360832 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.361073 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.362385 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jl9zh"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.369058 4760 generic.go:334] "Generic (PLEG): container finished" podID="c0bd9d36-6fb6-40f7-95f3-1393b8db1261" containerID="8a08d148baa0632676b2afc355f7c7a9b8adbaa9c02a366a0b15e7a0488d6b85" exitCode=0 Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.369160 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" event={"ID":"c0bd9d36-6fb6-40f7-95f3-1393b8db1261","Type":"ContainerDied","Data":"8a08d148baa0632676b2afc355f7c7a9b8adbaa9c02a366a0b15e7a0488d6b85"} Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.385053 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.430046 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-rb22w" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.431601 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.431651 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkztl\" (UniqueName: \"kubernetes.io/projected/40fefe31-76d7-458b-b4ef-fb49320cbb18-kube-api-access-mkztl\") pod \"neutron-db-sync-jl9zh\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.431719 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.431757 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.431803 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-config\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " 
pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.431823 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vm47\" (UniqueName: \"kubernetes.io/projected/df17b640-201e-457f-baf2-67c1d767c77e-kube-api-access-9vm47\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.431850 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.431914 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-log-httpd\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.431957 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-run-httpd\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.432021 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-config\") pod \"neutron-db-sync-jl9zh\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.432073 
4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4gtd\" (UniqueName: \"kubernetes.io/projected/23335f60-d3db-4308-b1fe-a4603a8d65e7-kube-api-access-v4gtd\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.432120 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-combined-ca-bundle\") pod \"neutron-db-sync-jl9zh\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.432158 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.432187 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-config-data\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.432244 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.432404 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-scripts\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.434427 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.446761 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.447515 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-config\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.449775 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.455586 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.459435 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rqwnf"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.461610 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.471512 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vm47\" (UniqueName: \"kubernetes.io/projected/df17b640-201e-457f-baf2-67c1d767c77e-kube-api-access-9vm47\") pod \"dnsmasq-dns-847c4cc679-zjxh7\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.472083 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.472487 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wjwwq" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.474897 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rqwnf"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.487900 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zjxh7"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.526693 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.532330 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-nmzx2"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.534972 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.537696 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkztl\" (UniqueName: \"kubernetes.io/projected/40fefe31-76d7-458b-b4ef-fb49320cbb18-kube-api-access-mkztl\") pod \"neutron-db-sync-jl9zh\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.538745 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.539869 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-log-httpd\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.541752 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-run-httpd\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.542006 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-config\") pod \"neutron-db-sync-jl9zh\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.542201 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4gtd\" (UniqueName: \"kubernetes.io/projected/23335f60-d3db-4308-b1fe-a4603a8d65e7-kube-api-access-v4gtd\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.542511 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-combined-ca-bundle\") pod \"neutron-db-sync-jl9zh\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.542669 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-config-data\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.542761 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.542987 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-scripts\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " 
pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.540331 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.551275 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-run-httpd\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.541614 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-log-httpd\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.540402 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.540457 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8kw6z" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.566202 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-scripts\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.566708 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-config\") pod \"neutron-db-sync-jl9zh\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.580344 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.589580 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-config-data\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.590037 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.594825 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-combined-ca-bundle\") pod \"neutron-db-sync-jl9zh\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.597072 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nmzx2"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.612983 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4gtd\" (UniqueName: \"kubernetes.io/projected/23335f60-d3db-4308-b1fe-a4603a8d65e7-kube-api-access-v4gtd\") pod \"ceilometer-0\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " pod="openstack/ceilometer-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.618370 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mkztl\" (UniqueName: \"kubernetes.io/projected/40fefe31-76d7-458b-b4ef-fb49320cbb18-kube-api-access-mkztl\") pod \"neutron-db-sync-jl9zh\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.627655 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-d5x4k"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.634271 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.651163 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-d5x4k"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.652402 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-config-data\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.652534 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx4nv\" (UniqueName: \"kubernetes.io/projected/fe6ec201-ccbb-4003-9893-13b6656a1624-kube-api-access-bx4nv\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.652588 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6ec201-ccbb-4003-9893-13b6656a1624-logs\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.652619 
4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-db-sync-config-data\") pod \"barbican-db-sync-rqwnf\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.652662 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-combined-ca-bundle\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.652681 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc8gw\" (UniqueName: \"kubernetes.io/projected/640263be-b424-4ed1-b0f5-d4b9907113e2-kube-api-access-mc8gw\") pod \"barbican-db-sync-rqwnf\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.652730 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-combined-ca-bundle\") pod \"barbican-db-sync-rqwnf\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.652760 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-scripts\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.668113 
4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6666859d7c-9bwlq"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.684917 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6666859d7c-9bwlq"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.685067 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.727569 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.729875 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.739509 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mvczn" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.740051 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.740248 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.740395 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 12:36:02 crc kubenswrapper[4760]: I1204 12:36:02.740556 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.804423 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh7cl\" (UniqueName: \"kubernetes.io/projected/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-kube-api-access-dh7cl\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: 
\"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.804610 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-scripts\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.804735 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.804916 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.805146 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-config\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.805954 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: 
\"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.806102 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-config-data\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.806280 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx4nv\" (UniqueName: \"kubernetes.io/projected/fe6ec201-ccbb-4003-9893-13b6656a1624-kube-api-access-bx4nv\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.806509 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6ec201-ccbb-4003-9893-13b6656a1624-logs\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.806656 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-db-sync-config-data\") pod \"barbican-db-sync-rqwnf\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.806689 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 
12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.806904 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-combined-ca-bundle\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.806985 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc8gw\" (UniqueName: \"kubernetes.io/projected/640263be-b424-4ed1-b0f5-d4b9907113e2-kube-api-access-mc8gw\") pod \"barbican-db-sync-rqwnf\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.807148 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-combined-ca-bundle\") pod \"barbican-db-sync-rqwnf\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.809768 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6ec201-ccbb-4003-9893-13b6656a1624-logs\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.814234 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.817446 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-scripts\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.830510 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-combined-ca-bundle\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.850973 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-db-sync-config-data\") pod \"barbican-db-sync-rqwnf\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.854569 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.875337 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-combined-ca-bundle\") pod \"barbican-db-sync-rqwnf\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.876121 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-config-data\") pod \"placement-db-sync-nmzx2\" (UID: 
\"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.881106 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc8gw\" (UniqueName: \"kubernetes.io/projected/640263be-b424-4ed1-b0f5-d4b9907113e2-kube-api-access-mc8gw\") pod \"barbican-db-sync-rqwnf\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.889114 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx4nv\" (UniqueName: \"kubernetes.io/projected/fe6ec201-ccbb-4003-9893-13b6656a1624-kube-api-access-bx4nv\") pod \"placement-db-sync-nmzx2\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.913685 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.915975 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvb88\" (UniqueName: \"kubernetes.io/projected/d9de2aa3-83f2-4701-b09e-00d0fab8403f-kube-api-access-qvb88\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.916013 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.916141 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.916185 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9de2aa3-83f2-4701-b09e-00d0fab8403f-horizon-secret-key\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.919952 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.936348 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.936480 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.936640 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc 
kubenswrapper[4760]: I1204 12:36:02.936693 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-config-data\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.936770 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-ceph\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.936829 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftx7m\" (UniqueName: \"kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-kube-api-access-ftx7m\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.936877 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh7cl\" (UniqueName: \"kubernetes.io/projected/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-kube-api-access-dh7cl\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.936943 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-scripts\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc 
kubenswrapper[4760]: I1204 12:36:02.936967 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.936992 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.937050 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9de2aa3-83f2-4701-b09e-00d0fab8403f-logs\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.937094 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.937144 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-logs\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 
12:36:02.937165 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.937241 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-config\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.937266 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.958706 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.959584 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.969448 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.986233 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.987983 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-config\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:02.996915 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.016087 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nmzx2" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.023825 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh7cl\" (UniqueName: \"kubernetes.io/projected/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-kube-api-access-dh7cl\") pod \"dnsmasq-dns-785d8bcb8c-d5x4k\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.047401 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.047537 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.047580 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9de2aa3-83f2-4701-b09e-00d0fab8403f-horizon-secret-key\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.047626 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc 
kubenswrapper[4760]: I1204 12:36:03.047745 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.047792 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-config-data\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.047823 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-ceph\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.047871 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftx7m\" (UniqueName: \"kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-kube-api-access-ftx7m\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.047914 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-scripts\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.047983 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.048012 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9de2aa3-83f2-4701-b09e-00d0fab8403f-logs\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.048064 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-logs\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.048085 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.048138 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvb88\" (UniqueName: \"kubernetes.io/projected/d9de2aa3-83f2-4701-b09e-00d0fab8403f-kube-api-access-qvb88\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.065455 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-logs\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.068647 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-config-data\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.068957 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.095956 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9de2aa3-83f2-4701-b09e-00d0fab8403f-logs\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.102176 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-scripts\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.103309 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-scripts\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " 
pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.116564 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.256541 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftx7m\" (UniqueName: \"kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-kube-api-access-ftx7m\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.256784 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9de2aa3-83f2-4701-b09e-00d0fab8403f-horizon-secret-key\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.256849 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-ceph\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.263664 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc 
kubenswrapper[4760]: I1204 12:36:03.283726 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.305814 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.362724 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.394016 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.394121 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.422853 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qvb88\" (UniqueName: \"kubernetes.io/projected/d9de2aa3-83f2-4701-b09e-00d0fab8403f-kube-api-access-qvb88\") pod \"horizon-6666859d7c-9bwlq\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.587323 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.649003 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.703608 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.704026 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.718703 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.773141 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.773359 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqk9\" (UniqueName: \"kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-kube-api-access-8wqk9\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.773475 
4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.773519 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-ceph\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.773671 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.773695 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.773714 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: 
I1204 12:36:03.773795 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.773911 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.877584 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.877640 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.877692 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.877761 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.877837 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.877906 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.877984 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqk9\" (UniqueName: \"kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-kube-api-access-8wqk9\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.878031 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.878085 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-ceph\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.880317 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.880650 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.880949 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.923972 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.924166 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-ceph\") pod 
\"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.924343 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.935488 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.947496 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqk9\" (UniqueName: \"kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-kube-api-access-8wqk9\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:03 crc kubenswrapper[4760]: I1204 12:36:03.966126 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.017392 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.054119 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-57zpb"] Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.064880 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-46wx2"] Dec 04 12:36:04 crc kubenswrapper[4760]: E1204 12:36:04.152393 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce3174d_015c_4a85_b58d_af7603479902.slice\": RecentStats: unable to find data in memory cache]" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.239918 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.266164 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.295250 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.399041 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvsd6\" (UniqueName: \"kubernetes.io/projected/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-kube-api-access-wvsd6\") pod \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.399174 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-swift-storage-0\") pod \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.399265 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-config\") pod \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.399327 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-sb\") pod \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.399394 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-svc\") pod \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.399463 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-nb\") pod \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\" (UID: \"c0bd9d36-6fb6-40f7-95f3-1393b8db1261\") " Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.419910 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-kube-api-access-wvsd6" (OuterVolumeSpecName: "kube-api-access-wvsd6") pod "c0bd9d36-6fb6-40f7-95f3-1393b8db1261" (UID: "c0bd9d36-6fb6-40f7-95f3-1393b8db1261"). InnerVolumeSpecName "kube-api-access-wvsd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.435683 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.446996 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.505128 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvsd6\" (UniqueName: \"kubernetes.io/projected/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-kube-api-access-wvsd6\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.539068 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.539267 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" event={"ID":"c0bd9d36-6fb6-40f7-95f3-1393b8db1261","Type":"ContainerDied","Data":"bd12605fac177480bbdb857154cb999b625f703e8f75336248aad0be81e13ffd"} Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.539990 4760 scope.go:117] "RemoveContainer" containerID="8a08d148baa0632676b2afc355f7c7a9b8adbaa9c02a366a0b15e7a0488d6b85" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.561055 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-config" (OuterVolumeSpecName: "config") pod "c0bd9d36-6fb6-40f7-95f3-1393b8db1261" (UID: "c0bd9d36-6fb6-40f7-95f3-1393b8db1261"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.568990 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-57zpb" event={"ID":"2adf9c9c-a451-4484-8630-d28b66e8e567","Type":"ContainerStarted","Data":"c6e6bf5bd2fe229f27063a49f979e53bb698e5595a41365ddd6276b1a6e965c7"} Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.574054 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0bd9d36-6fb6-40f7-95f3-1393b8db1261" (UID: "c0bd9d36-6fb6-40f7-95f3-1393b8db1261"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.589575 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-46wx2" event={"ID":"a8179a26-2281-4a5d-bc77-808a2f7e61bb","Type":"ContainerStarted","Data":"039259f79253606608032edf8d6888a8f357bb8e2e6a9242ad9f050e2792991c"} Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.593687 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0bd9d36-6fb6-40f7-95f3-1393b8db1261" (UID: "c0bd9d36-6fb6-40f7-95f3-1393b8db1261"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.611912 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.611956 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.611968 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.630517 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c0bd9d36-6fb6-40f7-95f3-1393b8db1261" (UID: "c0bd9d36-6fb6-40f7-95f3-1393b8db1261"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.673479 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0bd9d36-6fb6-40f7-95f3-1393b8db1261" (UID: "c0bd9d36-6fb6-40f7-95f3-1393b8db1261"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.696557 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84489dfbd7-57pm6"] Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.717006 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.717053 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0bd9d36-6fb6-40f7-95f3-1393b8db1261-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.760913 4760 scope.go:117] "RemoveContainer" containerID="1302d837b1971ee9c942f1a375f4696dec82f5a113d6f7864d8e42830b30c5cc" Dec 04 12:36:04 crc kubenswrapper[4760]: I1204 12:36:04.858031 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zjxh7"] Dec 04 12:36:07 crc kubenswrapper[4760]: W1204 12:36:05.112651 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf17b640_201e_457f_baf2_67c1d767c77e.slice/crio-03d8d8aa22db3bcd7b8536109f70695fe3a910ec8e2aef860fd8dad03508b99a WatchSource:0}: Error finding container 03d8d8aa22db3bcd7b8536109f70695fe3a910ec8e2aef860fd8dad03508b99a: Status 404 returned error can't 
find the container with id 03d8d8aa22db3bcd7b8536109f70695fe3a910ec8e2aef860fd8dad03508b99a Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:05.510076 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-rb22w"] Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:05.627258 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rb22w" event={"ID":"6e2d78cb-0c7a-408f-a736-6630b41bd80b","Type":"ContainerStarted","Data":"e0ce070d49b5495fa48fbcd38608443a459dfbd4a91e648839a36ecdb6a68db0"} Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:05.645727 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84489dfbd7-57pm6" event={"ID":"a1e1a276-940e-45e7-b6b3-f9650cbd653c","Type":"ContainerStarted","Data":"6ac727b71cc3440df93474f32003fc5abcb801be5d8d86c23e3855c69b7c57ce"} Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:05.674523 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" event={"ID":"df17b640-201e-457f-baf2-67c1d767c77e","Type":"ContainerStarted","Data":"03d8d8aa22db3bcd7b8536109f70695fe3a910ec8e2aef860fd8dad03508b99a"} Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:05.812814 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-crdnj"] Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:05.829287 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-crdnj"] Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:05.969730 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0bd9d36-6fb6-40f7-95f3-1393b8db1261" path="/var/lib/kubelet/pods/c0bd9d36-6fb6-40f7-95f3-1393b8db1261/volumes" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:05.978135 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jl9zh"] Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:05.978203 4760 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-nmzx2"] Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:05.978289 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rqwnf"] Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.131663 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.800676 4760 generic.go:334] "Generic (PLEG): container finished" podID="df17b640-201e-457f-baf2-67c1d767c77e" containerID="b9857b4260907f8af06caa6708a8f8e2cc49457a7f4f803d53a7a413edf2f3bb" exitCode=0 Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.801340 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" event={"ID":"df17b640-201e-457f-baf2-67c1d767c77e","Type":"ContainerDied","Data":"b9857b4260907f8af06caa6708a8f8e2cc49457a7f4f803d53a7a413edf2f3bb"} Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.838636 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23335f60-d3db-4308-b1fe-a4603a8d65e7","Type":"ContainerStarted","Data":"cd3adc897c69dfb42bb6fca985c1000bdf29435695513a637f21554b947b6409"} Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.882855 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-57zpb" event={"ID":"2adf9c9c-a451-4484-8630-d28b66e8e567","Type":"ContainerStarted","Data":"42eacc3e69784f66561eae6cd92fe685384906e8e756b0eb71d12cbc6f7e98ae"} Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.899327 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgdz6" podUID="2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.70:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 
12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.926579 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rqwnf" event={"ID":"640263be-b424-4ed1-b0f5-d4b9907113e2","Type":"ContainerStarted","Data":"de9710224142192f001fd11b973c93d999b81e9f217b5b158d1e93c4a8c90f7e"} Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.955702 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jl9zh" event={"ID":"40fefe31-76d7-458b-b4ef-fb49320cbb18","Type":"ContainerStarted","Data":"5551e9a8c6bfbc4526afcd0cfd2904800d4cb09646ae96bf9e71ee6d7eab81e7"} Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.955791 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jl9zh" event={"ID":"40fefe31-76d7-458b-b4ef-fb49320cbb18","Type":"ContainerStarted","Data":"b592ab8c7b05ec27c95fe65f39b780d0dc8ca5b9b50032a77bf61624fe002d73"} Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.962558 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" podUID="e8a9a9f4-8e40-4506-9aeb-c3e83d62de39" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.72:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.963106 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-8c7dd" podUID="e8a9a9f4-8e40-4506-9aeb-c3e83d62de39" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.72:8081/readyz\": dial tcp 10.217.0.72:8081: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.978883 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-57zpb" podStartSLOduration=6.978842316 
podStartE2EDuration="6.978842316s" podCreationTimestamp="2025-12-04 12:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:06.949495095 +0000 UTC m=+1369.990941662" watchObservedRunningTime="2025-12-04 12:36:06.978842316 +0000 UTC m=+1370.020288883" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:06.991320 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nmzx2" event={"ID":"fe6ec201-ccbb-4003-9893-13b6656a1624","Type":"ContainerStarted","Data":"6879120be722a47613daf71b70225d46f2614d279e96368790215906062a7ca2"} Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.045326 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jl9zh" podStartSLOduration=6.031707444 podStartE2EDuration="6.031707444s" podCreationTimestamp="2025-12-04 12:36:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:07.023641068 +0000 UTC m=+1370.065087635" watchObservedRunningTime="2025-12-04 12:36:07.031707444 +0000 UTC m=+1370.073154031" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.144165 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.394610 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6666859d7c-9bwlq"] Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.497239 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b5586b4cc-jx6kr"] Dec 04 12:36:07 crc kubenswrapper[4760]: E1204 12:36:07.497921 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bd9d36-6fb6-40f7-95f3-1393b8db1261" containerName="dnsmasq-dns" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.497954 4760 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c0bd9d36-6fb6-40f7-95f3-1393b8db1261" containerName="dnsmasq-dns" Dec 04 12:36:07 crc kubenswrapper[4760]: E1204 12:36:07.497996 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bd9d36-6fb6-40f7-95f3-1393b8db1261" containerName="init" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.498003 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bd9d36-6fb6-40f7-95f3-1393b8db1261" containerName="init" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.507513 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0bd9d36-6fb6-40f7-95f3-1393b8db1261" containerName="dnsmasq-dns" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.509641 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.658011 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b5586b4cc-jx6kr"] Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.662013 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-scripts\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.662128 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-config-data\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.662150 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6flrv\" (UniqueName: 
\"kubernetes.io/projected/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-kube-api-access-6flrv\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.662224 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-horizon-secret-key\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.662252 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-logs\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.766201 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-horizon-secret-key\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.766291 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-logs\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.766424 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-scripts\") pod 
\"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.766551 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-config-data\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.766810 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6flrv\" (UniqueName: \"kubernetes.io/projected/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-kube-api-access-6flrv\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.780042 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-config-data\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.781855 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-logs\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.791946 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-horizon-secret-key\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 
12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.792559 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-scripts\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.805996 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6flrv\" (UniqueName: \"kubernetes.io/projected/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-kube-api-access-6flrv\") pod \"horizon-5b5586b4cc-jx6kr\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:07 crc kubenswrapper[4760]: I1204 12:36:07.901307 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.096107 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.279048 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-crdnj" podUID="c0bd9d36-6fb6-40f7-95f3-1393b8db1261" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: i/o timeout" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.399702 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-d5x4k"] Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.416924 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6666859d7c-9bwlq"] Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.629503 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.716575 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-config\") pod \"df17b640-201e-457f-baf2-67c1d767c77e\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.716775 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-nb\") pod \"df17b640-201e-457f-baf2-67c1d767c77e\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.716858 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vm47\" (UniqueName: \"kubernetes.io/projected/df17b640-201e-457f-baf2-67c1d767c77e-kube-api-access-9vm47\") pod \"df17b640-201e-457f-baf2-67c1d767c77e\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.716967 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-swift-storage-0\") pod \"df17b640-201e-457f-baf2-67c1d767c77e\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.716995 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-svc\") pod \"df17b640-201e-457f-baf2-67c1d767c77e\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.717594 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-sb\") pod \"df17b640-201e-457f-baf2-67c1d767c77e\" (UID: \"df17b640-201e-457f-baf2-67c1d767c77e\") " Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.735244 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df17b640-201e-457f-baf2-67c1d767c77e-kube-api-access-9vm47" (OuterVolumeSpecName: "kube-api-access-9vm47") pod "df17b640-201e-457f-baf2-67c1d767c77e" (UID: "df17b640-201e-457f-baf2-67c1d767c77e"). InnerVolumeSpecName "kube-api-access-9vm47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.766411 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.779426 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df17b640-201e-457f-baf2-67c1d767c77e" (UID: "df17b640-201e-457f-baf2-67c1d767c77e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.794166 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df17b640-201e-457f-baf2-67c1d767c77e" (UID: "df17b640-201e-457f-baf2-67c1d767c77e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.811694 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df17b640-201e-457f-baf2-67c1d767c77e" (UID: "df17b640-201e-457f-baf2-67c1d767c77e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.826873 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.826927 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vm47\" (UniqueName: \"kubernetes.io/projected/df17b640-201e-457f-baf2-67c1d767c77e-kube-api-access-9vm47\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.826946 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.826961 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.838419 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-config" (OuterVolumeSpecName: "config") pod "df17b640-201e-457f-baf2-67c1d767c77e" (UID: "df17b640-201e-457f-baf2-67c1d767c77e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.844954 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df17b640-201e-457f-baf2-67c1d767c77e" (UID: "df17b640-201e-457f-baf2-67c1d767c77e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.895952 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b5586b4cc-jx6kr"] Dec 04 12:36:08 crc kubenswrapper[4760]: W1204 12:36:08.918471 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb14ef2a_1124_4b0f_9e0e_4772ae2b1e7f.slice/crio-405048ade6f366e417999cd9cd10c78a9dae5bd1f871bce4b4bc291a40815d13 WatchSource:0}: Error finding container 405048ade6f366e417999cd9cd10c78a9dae5bd1f871bce4b4bc291a40815d13: Status 404 returned error can't find the container with id 405048ade6f366e417999cd9cd10c78a9dae5bd1f871bce4b4bc291a40815d13 Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.928815 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:08 crc kubenswrapper[4760]: I1204 12:36:08.929098 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df17b640-201e-457f-baf2-67c1d767c77e-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.195764 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"9b1ded96-8cc2-448f-8054-8f65f467ba9a","Type":"ContainerStarted","Data":"03c6bb52da5200cd89e47ae105c0d34347a1c02c07e4e78e00069026f1458f88"} Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.206486 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6666859d7c-9bwlq" event={"ID":"d9de2aa3-83f2-4701-b09e-00d0fab8403f","Type":"ContainerStarted","Data":"5e691e460f195b813b38c37ef98230d32f26d6b7759bf56a6ab0aab6f3067e70"} Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.219813 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" event={"ID":"df17b640-201e-457f-baf2-67c1d767c77e","Type":"ContainerDied","Data":"03d8d8aa22db3bcd7b8536109f70695fe3a910ec8e2aef860fd8dad03508b99a"} Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.219959 4760 scope.go:117] "RemoveContainer" containerID="b9857b4260907f8af06caa6708a8f8e2cc49457a7f4f803d53a7a413edf2f3bb" Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.220336 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-zjxh7" Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.253437 4760 generic.go:334] "Generic (PLEG): container finished" podID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" containerID="886a3015c4461c6a8f271dcda624ec4e3465b66918e422e487241938df9f2f01" exitCode=0 Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.253541 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" event={"ID":"dcf8aac0-0b5c-4170-b849-374f1f4fd65c","Type":"ContainerDied","Data":"886a3015c4461c6a8f271dcda624ec4e3465b66918e422e487241938df9f2f01"} Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.253588 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" event={"ID":"dcf8aac0-0b5c-4170-b849-374f1f4fd65c","Type":"ContainerStarted","Data":"d3542bc6231416c72f6ff5158c3d9f8d4512e7462265a6ce68e0b0d6cff0edcb"} Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.262008 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b5586b4cc-jx6kr" event={"ID":"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f","Type":"ContainerStarted","Data":"405048ade6f366e417999cd9cd10c78a9dae5bd1f871bce4b4bc291a40815d13"} Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.327168 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zjxh7"] Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.359037 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-zjxh7"] Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.598584 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:36:09 crc kubenswrapper[4760]: I1204 12:36:09.891660 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df17b640-201e-457f-baf2-67c1d767c77e" 
path="/var/lib/kubelet/pods/df17b640-201e-457f-baf2-67c1d767c77e/volumes" Dec 04 12:36:10 crc kubenswrapper[4760]: I1204 12:36:10.317064 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" event={"ID":"dcf8aac0-0b5c-4170-b849-374f1f4fd65c","Type":"ContainerStarted","Data":"5db3f6150ba5d584cfeef7ded8abacea57ed7ed3720f0750a2fdd96735285378"} Dec 04 12:36:10 crc kubenswrapper[4760]: I1204 12:36:10.317151 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:10 crc kubenswrapper[4760]: I1204 12:36:10.324996 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc076227-def5-4f6f-8d73-5266e7237847","Type":"ContainerStarted","Data":"e395d1eabefb893f68d82f2493bf5ca1e5e9278787a89461a5669f0f8cabb7fd"} Dec 04 12:36:10 crc kubenswrapper[4760]: I1204 12:36:10.359809 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" podStartSLOduration=9.359764177 podStartE2EDuration="9.359764177s" podCreationTimestamp="2025-12-04 12:36:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:10.355734999 +0000 UTC m=+1373.397181976" watchObservedRunningTime="2025-12-04 12:36:10.359764177 +0000 UTC m=+1373.401210744" Dec 04 12:36:11 crc kubenswrapper[4760]: I1204 12:36:11.426027 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc076227-def5-4f6f-8d73-5266e7237847","Type":"ContainerStarted","Data":"127ca536165f2ec8ae2b3fdc19bd5f97d83682253f188aefab8f3815e316ce64"} Dec 04 12:36:11 crc kubenswrapper[4760]: I1204 12:36:11.448785 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"9b1ded96-8cc2-448f-8054-8f65f467ba9a","Type":"ContainerStarted","Data":"c2eea84c49bbcd70070833e34e4eb146a61c5a55c94da1f30c54c7e7d322bf3c"} Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.471359 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84489dfbd7-57pm6"] Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.527310 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b1ded96-8cc2-448f-8054-8f65f467ba9a","Type":"ContainerStarted","Data":"fe3f7c1ab296fa50fb9b5fa7d663e62a408ed4f196559998ec7915953588d7b2"} Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.527638 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" containerName="glance-log" containerID="cri-o://c2eea84c49bbcd70070833e34e4eb146a61c5a55c94da1f30c54c7e7d322bf3c" gracePeriod=30 Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.528521 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" containerName="glance-httpd" containerID="cri-o://fe3f7c1ab296fa50fb9b5fa7d663e62a408ed4f196559998ec7915953588d7b2" gracePeriod=30 Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.546559 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66f8fb5648-87dff"] Dec 04 12:36:13 crc kubenswrapper[4760]: E1204 12:36:13.547419 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df17b640-201e-457f-baf2-67c1d767c77e" containerName="init" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.547441 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="df17b640-201e-457f-baf2-67c1d767c77e" containerName="init" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.547681 4760 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="df17b640-201e-457f-baf2-67c1d767c77e" containerName="init" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.549023 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.560310 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.561124 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66f8fb5648-87dff"] Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.603422 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-config-data\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.603817 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-secret-key\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.603904 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8gbs\" (UniqueName: \"kubernetes.io/projected/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-kube-api-access-m8gbs\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.604070 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-logs\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.604144 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-scripts\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.604268 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-combined-ca-bundle\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.604395 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-tls-certs\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.621149 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.621117143 podStartE2EDuration="12.621117143s" podCreationTimestamp="2025-12-04 12:36:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:13.587784914 +0000 UTC m=+1376.629231481" watchObservedRunningTime="2025-12-04 12:36:13.621117143 +0000 UTC m=+1376.662563710" Dec 04 12:36:13 crc 
kubenswrapper[4760]: I1204 12:36:13.661080 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b5586b4cc-jx6kr"] Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.707574 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-logs\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.707648 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-scripts\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.707712 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-combined-ca-bundle\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.707765 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-tls-certs\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.707805 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-config-data\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " 
pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.707866 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-secret-key\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.707892 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8gbs\" (UniqueName: \"kubernetes.io/projected/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-kube-api-access-m8gbs\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.709629 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-logs\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.710472 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-scripts\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.714684 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-config-data\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.720321 4760 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/horizon-5b7fc6c944-sh7tv"] Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.722640 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.745408 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-combined-ca-bundle\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.746531 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-secret-key\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.754911 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-tls-certs\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.779628 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b7fc6c944-sh7tv"] Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.796842 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8gbs\" (UniqueName: \"kubernetes.io/projected/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-kube-api-access-m8gbs\") pod \"horizon-66f8fb5648-87dff\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.813142 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-horizon-tls-certs\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.813283 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-scripts\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.813429 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-horizon-secret-key\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.813460 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2khq\" (UniqueName: \"kubernetes.io/projected/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-kube-api-access-s2khq\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.813558 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-combined-ca-bundle\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 
12:36:13.813631 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-config-data\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.813667 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-logs\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.912419 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.915962 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-config-data\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.916042 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-logs\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.916152 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-horizon-tls-certs\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " 
pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.916329 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-scripts\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.916499 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-horizon-secret-key\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.916540 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2khq\" (UniqueName: \"kubernetes.io/projected/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-kube-api-access-s2khq\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.918272 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-logs\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.918292 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-scripts\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.919199 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-combined-ca-bundle\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.921851 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-config-data\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.935767 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-combined-ca-bundle\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.942988 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-horizon-secret-key\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.957037 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-horizon-tls-certs\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:13 crc kubenswrapper[4760]: I1204 12:36:13.961826 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2khq\" (UniqueName: 
\"kubernetes.io/projected/a6452e5d-5eb7-4d21-96ea-eefbc327f2f5-kube-api-access-s2khq\") pod \"horizon-5b7fc6c944-sh7tv\" (UID: \"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5\") " pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:14 crc kubenswrapper[4760]: I1204 12:36:14.172972 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:36:14 crc kubenswrapper[4760]: I1204 12:36:14.251514 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:36:14 crc kubenswrapper[4760]: I1204 12:36:14.430491 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tvztj"] Dec 04 12:36:14 crc kubenswrapper[4760]: I1204 12:36:14.437854 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" podUID="c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" containerName="dnsmasq-dns" containerID="cri-o://7927cd2b5bb980fbc7da2a3fcde5347dce709caa429923ca2ece1c501e4d930b" gracePeriod=10 Dec 04 12:36:14 crc kubenswrapper[4760]: I1204 12:36:14.627509 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc076227-def5-4f6f-8d73-5266e7237847","Type":"ContainerStarted","Data":"ed29677e02dfd161e1309c1fb649940d5c5587818c866ff6c21352261d77e5e7"} Dec 04 12:36:14 crc kubenswrapper[4760]: I1204 12:36:14.628278 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bc076227-def5-4f6f-8d73-5266e7237847" containerName="glance-log" containerID="cri-o://127ca536165f2ec8ae2b3fdc19bd5f97d83682253f188aefab8f3815e316ce64" gracePeriod=30 Dec 04 12:36:14 crc kubenswrapper[4760]: I1204 12:36:14.629531 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="bc076227-def5-4f6f-8d73-5266e7237847" containerName="glance-httpd" containerID="cri-o://ed29677e02dfd161e1309c1fb649940d5c5587818c866ff6c21352261d77e5e7" gracePeriod=30 Dec 04 12:36:14 crc kubenswrapper[4760]: I1204 12:36:14.649610 4760 generic.go:334] "Generic (PLEG): container finished" podID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" containerID="c2eea84c49bbcd70070833e34e4eb146a61c5a55c94da1f30c54c7e7d322bf3c" exitCode=143 Dec 04 12:36:14 crc kubenswrapper[4760]: I1204 12:36:14.649690 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b1ded96-8cc2-448f-8054-8f65f467ba9a","Type":"ContainerDied","Data":"c2eea84c49bbcd70070833e34e4eb146a61c5a55c94da1f30c54c7e7d322bf3c"} Dec 04 12:36:14 crc kubenswrapper[4760]: I1204 12:36:14.687307 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.687266206 podStartE2EDuration="13.687266206s" podCreationTimestamp="2025-12-04 12:36:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:14.682413211 +0000 UTC m=+1377.723859788" watchObservedRunningTime="2025-12-04 12:36:14.687266206 +0000 UTC m=+1377.728712793" Dec 04 12:36:15 crc kubenswrapper[4760]: E1204 12:36:15.080010 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1f93fe9_3eaf_4ed7_b303_4a5f00eaa213.slice/crio-conmon-7927cd2b5bb980fbc7da2a3fcde5347dce709caa429923ca2ece1c501e4d930b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce3174d_015c_4a85_b58d_af7603479902.slice\": RecentStats: unable to find data in memory cache]" Dec 04 12:36:15 crc kubenswrapper[4760]: I1204 12:36:15.329129 4760 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/horizon-66f8fb5648-87dff"] Dec 04 12:36:15 crc kubenswrapper[4760]: I1204 12:36:15.346440 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b7fc6c944-sh7tv"] Dec 04 12:36:15 crc kubenswrapper[4760]: I1204 12:36:15.700868 4760 generic.go:334] "Generic (PLEG): container finished" podID="bc076227-def5-4f6f-8d73-5266e7237847" containerID="ed29677e02dfd161e1309c1fb649940d5c5587818c866ff6c21352261d77e5e7" exitCode=0 Dec 04 12:36:15 crc kubenswrapper[4760]: I1204 12:36:15.700952 4760 generic.go:334] "Generic (PLEG): container finished" podID="bc076227-def5-4f6f-8d73-5266e7237847" containerID="127ca536165f2ec8ae2b3fdc19bd5f97d83682253f188aefab8f3815e316ce64" exitCode=143 Dec 04 12:36:15 crc kubenswrapper[4760]: I1204 12:36:15.701103 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc076227-def5-4f6f-8d73-5266e7237847","Type":"ContainerDied","Data":"ed29677e02dfd161e1309c1fb649940d5c5587818c866ff6c21352261d77e5e7"} Dec 04 12:36:15 crc kubenswrapper[4760]: I1204 12:36:15.701182 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc076227-def5-4f6f-8d73-5266e7237847","Type":"ContainerDied","Data":"127ca536165f2ec8ae2b3fdc19bd5f97d83682253f188aefab8f3815e316ce64"} Dec 04 12:36:15 crc kubenswrapper[4760]: I1204 12:36:15.715096 4760 generic.go:334] "Generic (PLEG): container finished" podID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" containerID="fe3f7c1ab296fa50fb9b5fa7d663e62a408ed4f196559998ec7915953588d7b2" exitCode=0 Dec 04 12:36:15 crc kubenswrapper[4760]: I1204 12:36:15.715193 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b1ded96-8cc2-448f-8054-8f65f467ba9a","Type":"ContainerDied","Data":"fe3f7c1ab296fa50fb9b5fa7d663e62a408ed4f196559998ec7915953588d7b2"} Dec 04 12:36:15 crc kubenswrapper[4760]: I1204 
12:36:15.730579 4760 generic.go:334] "Generic (PLEG): container finished" podID="c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" containerID="7927cd2b5bb980fbc7da2a3fcde5347dce709caa429923ca2ece1c501e4d930b" exitCode=0 Dec 04 12:36:15 crc kubenswrapper[4760]: I1204 12:36:15.730680 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" event={"ID":"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213","Type":"ContainerDied","Data":"7927cd2b5bb980fbc7da2a3fcde5347dce709caa429923ca2ece1c501e4d930b"} Dec 04 12:36:16 crc kubenswrapper[4760]: I1204 12:36:16.755185 4760 generic.go:334] "Generic (PLEG): container finished" podID="2adf9c9c-a451-4484-8630-d28b66e8e567" containerID="42eacc3e69784f66561eae6cd92fe685384906e8e756b0eb71d12cbc6f7e98ae" exitCode=0 Dec 04 12:36:16 crc kubenswrapper[4760]: I1204 12:36:16.755383 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-57zpb" event={"ID":"2adf9c9c-a451-4484-8630-d28b66e8e567","Type":"ContainerDied","Data":"42eacc3e69784f66561eae6cd92fe685384906e8e756b0eb71d12cbc6f7e98ae"} Dec 04 12:36:19 crc kubenswrapper[4760]: W1204 12:36:19.914529 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6452e5d_5eb7_4d21_96ea_eefbc327f2f5.slice/crio-1eb15146866d5b4f8eb6a4e5151c85986f0a46ad7ff47cc4f65ff5e02279f869 WatchSource:0}: Error finding container 1eb15146866d5b4f8eb6a4e5151c85986f0a46ad7ff47cc4f65ff5e02279f869: Status 404 returned error can't find the container with id 1eb15146866d5b4f8eb6a4e5151c85986f0a46ad7ff47cc4f65ff5e02279f869 Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.020526 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.256889 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-svc\") pod \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.256997 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-config\") pod \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.257029 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-swift-storage-0\") pod \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.257087 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94svf\" (UniqueName: \"kubernetes.io/projected/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-kube-api-access-94svf\") pod \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.257133 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-nb\") pod \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.257185 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-sb\") pod \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.274753 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-kube-api-access-94svf" (OuterVolumeSpecName: "kube-api-access-94svf") pod "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" (UID: "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213"). InnerVolumeSpecName "kube-api-access-94svf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.360727 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" (UID: "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.361796 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-sb\") pod \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\" (UID: \"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213\") " Dec 04 12:36:20 crc kubenswrapper[4760]: W1204 12:36:20.361909 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213/volumes/kubernetes.io~configmap/ovsdbserver-sb Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.361925 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" (UID: "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.362555 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94svf\" (UniqueName: \"kubernetes.io/projected/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-kube-api-access-94svf\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.362591 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.366866 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" (UID: "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.374929 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" (UID: "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.376382 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-config" (OuterVolumeSpecName: "config") pod "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" (UID: "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.380870 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" (UID: "c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.465720 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.465768 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.465778 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.465788 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.822993 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f8fb5648-87dff" event={"ID":"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc","Type":"ContainerStarted","Data":"be6dafefd520de142d8f1d0b12d74c215d160de63da908763acfe2c8e19d2785"} Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.827120 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" event={"ID":"c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213","Type":"ContainerDied","Data":"9c600dd39c97a26ec7f0ba8cd2138d2fada28c72a86ff038d32acaeaccdd8dfe"} Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.827179 4760 scope.go:117] "RemoveContainer" containerID="7927cd2b5bb980fbc7da2a3fcde5347dce709caa429923ca2ece1c501e4d930b" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.827450 4760 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.832991 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7fc6c944-sh7tv" event={"ID":"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5","Type":"ContainerStarted","Data":"1eb15146866d5b4f8eb6a4e5151c85986f0a46ad7ff47cc4f65ff5e02279f869"} Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.882914 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tvztj"] Dec 04 12:36:20 crc kubenswrapper[4760]: I1204 12:36:20.894059 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tvztj"] Dec 04 12:36:21 crc kubenswrapper[4760]: I1204 12:36:21.881830 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" path="/var/lib/kubelet/pods/c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213/volumes" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.202109 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.215121 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.240428 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.260041 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-scripts\") pod \"bc076227-def5-4f6f-8d73-5266e7237847\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.260128 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-credential-keys\") pod \"2adf9c9c-a451-4484-8630-d28b66e8e567\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.260164 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-scripts\") pod \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.260198 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-httpd-run\") pod \"bc076227-def5-4f6f-8d73-5266e7237847\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.260246 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-fernet-keys\") pod \"2adf9c9c-a451-4484-8630-d28b66e8e567\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.260270 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-ceph\") pod \"bc076227-def5-4f6f-8d73-5266e7237847\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.260287 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-logs\") pod \"bc076227-def5-4f6f-8d73-5266e7237847\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.260510 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-combined-ca-bundle\") pod \"2adf9c9c-a451-4484-8630-d28b66e8e567\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.261009 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x4zw\" (UniqueName: \"kubernetes.io/projected/2adf9c9c-a451-4484-8630-d28b66e8e567-kube-api-access-6x4zw\") pod \"2adf9c9c-a451-4484-8630-d28b66e8e567\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262234 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-config-data\") pod \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262291 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-config-data\") pod \"2adf9c9c-a451-4484-8630-d28b66e8e567\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262308 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-combined-ca-bundle\") pod \"bc076227-def5-4f6f-8d73-5266e7237847\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262342 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-internal-tls-certs\") pod \"bc076227-def5-4f6f-8d73-5266e7237847\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262402 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-scripts\") pod \"2adf9c9c-a451-4484-8630-d28b66e8e567\" (UID: \"2adf9c9c-a451-4484-8630-d28b66e8e567\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262429 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-combined-ca-bundle\") pod \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262460 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-logs\") pod \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262486 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-httpd-run\") pod \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\" (UID: 
\"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262515 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262558 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftx7m\" (UniqueName: \"kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-kube-api-access-ftx7m\") pod \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262581 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wqk9\" (UniqueName: \"kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-kube-api-access-8wqk9\") pod \"bc076227-def5-4f6f-8d73-5266e7237847\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262672 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"bc076227-def5-4f6f-8d73-5266e7237847\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262736 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-public-tls-certs\") pod \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262793 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-ceph\") pod \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\" (UID: \"9b1ded96-8cc2-448f-8054-8f65f467ba9a\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.262823 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-config-data\") pod \"bc076227-def5-4f6f-8d73-5266e7237847\" (UID: \"bc076227-def5-4f6f-8d73-5266e7237847\") " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.261516 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bc076227-def5-4f6f-8d73-5266e7237847" (UID: "bc076227-def5-4f6f-8d73-5266e7237847"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.272415 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-scripts" (OuterVolumeSpecName: "scripts") pod "2adf9c9c-a451-4484-8630-d28b66e8e567" (UID: "2adf9c9c-a451-4484-8630-d28b66e8e567"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.272546 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-scripts" (OuterVolumeSpecName: "scripts") pod "9b1ded96-8cc2-448f-8054-8f65f467ba9a" (UID: "9b1ded96-8cc2-448f-8054-8f65f467ba9a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.272530 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-ceph" (OuterVolumeSpecName: "ceph") pod "bc076227-def5-4f6f-8d73-5266e7237847" (UID: "bc076227-def5-4f6f-8d73-5266e7237847"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.276023 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.276086 4760 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.276103 4760 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.276113 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.276687 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-logs" (OuterVolumeSpecName: "logs") pod "9b1ded96-8cc2-448f-8054-8f65f467ba9a" (UID: "9b1ded96-8cc2-448f-8054-8f65f467ba9a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.277049 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9b1ded96-8cc2-448f-8054-8f65f467ba9a" (UID: "9b1ded96-8cc2-448f-8054-8f65f467ba9a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.277841 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-kube-api-access-ftx7m" (OuterVolumeSpecName: "kube-api-access-ftx7m") pod "9b1ded96-8cc2-448f-8054-8f65f467ba9a" (UID: "9b1ded96-8cc2-448f-8054-8f65f467ba9a"). InnerVolumeSpecName "kube-api-access-ftx7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.286607 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-logs" (OuterVolumeSpecName: "logs") pod "bc076227-def5-4f6f-8d73-5266e7237847" (UID: "bc076227-def5-4f6f-8d73-5266e7237847"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.300560 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2adf9c9c-a451-4484-8630-d28b66e8e567" (UID: "2adf9c9c-a451-4484-8630-d28b66e8e567"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.301693 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-scripts" (OuterVolumeSpecName: "scripts") pod "bc076227-def5-4f6f-8d73-5266e7237847" (UID: "bc076227-def5-4f6f-8d73-5266e7237847"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.302314 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "9b1ded96-8cc2-448f-8054-8f65f467ba9a" (UID: "9b1ded96-8cc2-448f-8054-8f65f467ba9a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.305852 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2adf9c9c-a451-4484-8630-d28b66e8e567" (UID: "2adf9c9c-a451-4484-8630-d28b66e8e567"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.308356 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-kube-api-access-8wqk9" (OuterVolumeSpecName: "kube-api-access-8wqk9") pod "bc076227-def5-4f6f-8d73-5266e7237847" (UID: "bc076227-def5-4f6f-8d73-5266e7237847"). InnerVolumeSpecName "kube-api-access-8wqk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.309803 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "bc076227-def5-4f6f-8d73-5266e7237847" (UID: "bc076227-def5-4f6f-8d73-5266e7237847"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.331078 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2adf9c9c-a451-4484-8630-d28b66e8e567-kube-api-access-6x4zw" (OuterVolumeSpecName: "kube-api-access-6x4zw") pod "2adf9c9c-a451-4484-8630-d28b66e8e567" (UID: "2adf9c9c-a451-4484-8630-d28b66e8e567"). InnerVolumeSpecName "kube-api-access-6x4zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.331194 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-ceph" (OuterVolumeSpecName: "ceph") pod "9b1ded96-8cc2-448f-8054-8f65f467ba9a" (UID: "9b1ded96-8cc2-448f-8054-8f65f467ba9a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.338092 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc076227-def5-4f6f-8d73-5266e7237847" (UID: "bc076227-def5-4f6f-8d73-5266e7237847"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.374117 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-config-data" (OuterVolumeSpecName: "config-data") pod "2adf9c9c-a451-4484-8630-d28b66e8e567" (UID: "2adf9c9c-a451-4484-8630-d28b66e8e567"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379280 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379324 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379341 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379353 4760 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b1ded96-8cc2-448f-8054-8f65f467ba9a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379388 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379403 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wqk9\" (UniqueName: 
\"kubernetes.io/projected/bc076227-def5-4f6f-8d73-5266e7237847-kube-api-access-8wqk9\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379416 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftx7m\" (UniqueName: \"kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-kube-api-access-ftx7m\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379436 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379448 4760 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9b1ded96-8cc2-448f-8054-8f65f467ba9a-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379460 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379472 4760 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379483 4760 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379496 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc076227-def5-4f6f-8d73-5266e7237847-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.379509 4760 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x4zw\" (UniqueName: \"kubernetes.io/projected/2adf9c9c-a451-4484-8630-d28b66e8e567-kube-api-access-6x4zw\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.382152 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b1ded96-8cc2-448f-8054-8f65f467ba9a" (UID: "9b1ded96-8cc2-448f-8054-8f65f467ba9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.398933 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2adf9c9c-a451-4484-8630-d28b66e8e567" (UID: "2adf9c9c-a451-4484-8630-d28b66e8e567"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.410734 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-config-data" (OuterVolumeSpecName: "config-data") pod "9b1ded96-8cc2-448f-8054-8f65f467ba9a" (UID: "9b1ded96-8cc2-448f-8054-8f65f467ba9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.420525 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-config-data" (OuterVolumeSpecName: "config-data") pod "bc076227-def5-4f6f-8d73-5266e7237847" (UID: "bc076227-def5-4f6f-8d73-5266e7237847"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.428518 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9b1ded96-8cc2-448f-8054-8f65f467ba9a" (UID: "9b1ded96-8cc2-448f-8054-8f65f467ba9a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.446576 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.448869 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.451041 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bc076227-def5-4f6f-8d73-5266e7237847" (UID: "bc076227-def5-4f6f-8d73-5266e7237847"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.484503 4760 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.484562 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.484581 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.484597 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.484611 4760 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.484626 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc076227-def5-4f6f-8d73-5266e7237847-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.484643 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2adf9c9c-a451-4484-8630-d28b66e8e567-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.484656 4760 reconciler_common.go:293] "Volume detached 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1ded96-8cc2-448f-8054-8f65f467ba9a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.877501 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.880792 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc076227-def5-4f6f-8d73-5266e7237847","Type":"ContainerDied","Data":"e395d1eabefb893f68d82f2493bf5ca1e5e9278787a89461a5669f0f8cabb7fd"} Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.895647 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9b1ded96-8cc2-448f-8054-8f65f467ba9a","Type":"ContainerDied","Data":"03c6bb52da5200cd89e47ae105c0d34347a1c02c07e4e78e00069026f1458f88"} Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.895990 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.903513 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-57zpb" event={"ID":"2adf9c9c-a451-4484-8630-d28b66e8e567","Type":"ContainerDied","Data":"c6e6bf5bd2fe229f27063a49f979e53bb698e5595a41365ddd6276b1a6e965c7"} Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.903598 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6e6bf5bd2fe229f27063a49f979e53bb698e5595a41365ddd6276b1a6e965c7" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.903770 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-57zpb" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.948112 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-tvztj" podUID="c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: i/o timeout" Dec 04 12:36:23 crc kubenswrapper[4760]: I1204 12:36:23.989091 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.011427 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.036334 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.076451 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.104083 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:36:24 crc kubenswrapper[4760]: E1204 12:36:24.104920 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" containerName="init" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.104948 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" containerName="init" Dec 04 12:36:24 crc kubenswrapper[4760]: E1204 12:36:24.104970 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc076227-def5-4f6f-8d73-5266e7237847" containerName="glance-httpd" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.104978 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc076227-def5-4f6f-8d73-5266e7237847" containerName="glance-httpd" Dec 04 12:36:24 crc 
kubenswrapper[4760]: E1204 12:36:24.104997 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" containerName="glance-log" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105013 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" containerName="glance-log" Dec 04 12:36:24 crc kubenswrapper[4760]: E1204 12:36:24.105029 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" containerName="glance-httpd" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105037 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" containerName="glance-httpd" Dec 04 12:36:24 crc kubenswrapper[4760]: E1204 12:36:24.105052 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc076227-def5-4f6f-8d73-5266e7237847" containerName="glance-log" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105059 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc076227-def5-4f6f-8d73-5266e7237847" containerName="glance-log" Dec 04 12:36:24 crc kubenswrapper[4760]: E1204 12:36:24.105077 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2adf9c9c-a451-4484-8630-d28b66e8e567" containerName="keystone-bootstrap" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105085 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2adf9c9c-a451-4484-8630-d28b66e8e567" containerName="keystone-bootstrap" Dec 04 12:36:24 crc kubenswrapper[4760]: E1204 12:36:24.105105 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" containerName="dnsmasq-dns" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105114 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" containerName="dnsmasq-dns" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105404 
4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1f93fe9-3eaf-4ed7-b303-4a5f00eaa213" containerName="dnsmasq-dns" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105441 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc076227-def5-4f6f-8d73-5266e7237847" containerName="glance-log" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105492 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" containerName="glance-httpd" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105546 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2adf9c9c-a451-4484-8630-d28b66e8e567" containerName="keystone-bootstrap" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105564 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" containerName="glance-log" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.105578 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc076227-def5-4f6f-8d73-5266e7237847" containerName="glance-httpd" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.107264 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.111662 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mvczn" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.112506 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.112781 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.113008 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.113479 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.129170 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.163463 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.167994 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.171575 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.174873 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.180030 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.213836 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.213927 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.213960 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2frdn\" (UniqueName: \"kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-kube-api-access-2frdn\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.213994 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214039 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214070 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-logs\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214112 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214146 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-ceph\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214189 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h9r6x\" (UniqueName: \"kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-kube-api-access-h9r6x\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214252 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214282 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214309 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214333 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-config-data\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214357 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214400 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-scripts\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214434 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-logs\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214466 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.214496 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.316027 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.316750 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.316917 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-ceph\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.316982 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9r6x\" (UniqueName: \"kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-kube-api-access-h9r6x\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.317527 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318123 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318166 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318204 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-config-data\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318259 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318347 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-scripts\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318384 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-logs\") pod \"glance-default-external-api-0\" (UID: 
\"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318426 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318479 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318595 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318676 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2frdn\" (UniqueName: \"kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-kube-api-access-2frdn\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318703 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " 
pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318745 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318838 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.318882 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-logs\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.319405 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-logs\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.320585 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.322113 4760 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.323811 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-logs\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.325172 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.327034 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.327141 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.337301 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-ceph\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.337480 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.337525 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-scripts\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.338189 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.338314 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.338617 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.349854 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.352550 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9r6x\" (UniqueName: \"kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-kube-api-access-h9r6x\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.360163 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.360523 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2frdn\" (UniqueName: \"kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-kube-api-access-2frdn\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.388311 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.391939 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.405127 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-57zpb"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.416579 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-57zpb"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.460998 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.485686 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8n9vw"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.490864 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.496381 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.496549 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnvh9" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.496402 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.497874 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.506335 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.512350 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.515453 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8n9vw"] Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.630052 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-credential-keys\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.630462 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-scripts\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " 
pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.631520 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-config-data\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.631699 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-fernet-keys\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.631981 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-combined-ca-bundle\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.632068 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98h8q\" (UniqueName: \"kubernetes.io/projected/30fe1729-73ac-43f6-bc34-b335da33a7e6-kube-api-access-98h8q\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.734635 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-combined-ca-bundle\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " 
pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.734755 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98h8q\" (UniqueName: \"kubernetes.io/projected/30fe1729-73ac-43f6-bc34-b335da33a7e6-kube-api-access-98h8q\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.734830 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-credential-keys\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.734889 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-scripts\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.734974 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-config-data\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.735088 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-fernet-keys\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.746521 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-scripts\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.749162 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-combined-ca-bundle\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.749658 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-config-data\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.760948 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-fernet-keys\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.765021 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98h8q\" (UniqueName: \"kubernetes.io/projected/30fe1729-73ac-43f6-bc34-b335da33a7e6-kube-api-access-98h8q\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.765537 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-credential-keys\") pod \"keystone-bootstrap-8n9vw\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:24 crc kubenswrapper[4760]: I1204 12:36:24.820682 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:36:25 crc kubenswrapper[4760]: I1204 12:36:25.881944 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2adf9c9c-a451-4484-8630-d28b66e8e567" path="/var/lib/kubelet/pods/2adf9c9c-a451-4484-8630-d28b66e8e567/volumes" Dec 04 12:36:25 crc kubenswrapper[4760]: I1204 12:36:25.883574 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1ded96-8cc2-448f-8054-8f65f467ba9a" path="/var/lib/kubelet/pods/9b1ded96-8cc2-448f-8054-8f65f467ba9a/volumes" Dec 04 12:36:25 crc kubenswrapper[4760]: I1204 12:36:25.884410 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc076227-def5-4f6f-8d73-5266e7237847" path="/var/lib/kubelet/pods/bc076227-def5-4f6f-8d73-5266e7237847/volumes" Dec 04 12:36:30 crc kubenswrapper[4760]: E1204 12:36:30.919947 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 04 12:36:30 crc kubenswrapper[4760]: E1204 12:36:30.921241 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f6h548h674h87h696hb6h65dh78h558h68dh67hcch65chcbh99h56h549h68ch58h5fh89h7bhc4h546h554hb6h557hcbh8bh598hdch587q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqgfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-84489dfbd7-57pm6_openstack(a1e1a276-940e-45e7-b6b3-f9650cbd653c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:36:30 crc kubenswrapper[4760]: E1204 
12:36:30.924443 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-84489dfbd7-57pm6" podUID="a1e1a276-940e-45e7-b6b3-f9650cbd653c" Dec 04 12:36:33 crc kubenswrapper[4760]: I1204 12:36:33.380937 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:36:33 crc kubenswrapper[4760]: I1204 12:36:33.382056 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:36:34 crc kubenswrapper[4760]: I1204 12:36:34.032910 4760 generic.go:334] "Generic (PLEG): container finished" podID="40fefe31-76d7-458b-b4ef-fb49320cbb18" containerID="5551e9a8c6bfbc4526afcd0cfd2904800d4cb09646ae96bf9e71ee6d7eab81e7" exitCode=0 Dec 04 12:36:34 crc kubenswrapper[4760]: I1204 12:36:34.032981 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jl9zh" event={"ID":"40fefe31-76d7-458b-b4ef-fb49320cbb18","Type":"ContainerDied","Data":"5551e9a8c6bfbc4526afcd0cfd2904800d4cb09646ae96bf9e71ee6d7eab81e7"} Dec 04 12:36:45 crc kubenswrapper[4760]: E1204 12:36:45.645310 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 04 12:36:45 crc kubenswrapper[4760]: E1204 12:36:45.646316 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b5h689h5h5f4h5bh66dh655hd7h65h6bh677hc5h64dh66ch584h698h59ch5f4h656h64ch666h78h7fh654h695h66dh5dfh686h5fh655h544h655q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6flrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevi
ces:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5b5586b4cc-jx6kr_openstack(eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:36:45 crc kubenswrapper[4760]: E1204 12:36:45.650188 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5b5586b4cc-jx6kr" podUID="eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f" Dec 04 12:36:45 crc kubenswrapper[4760]: E1204 12:36:45.652766 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 04 12:36:45 crc kubenswrapper[4760]: E1204 12:36:45.652976 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n84h597h665h575h96h574h5bh59fh584h576h578hfh64fh59dhc9h565h89h5dh554h5dfh585h5fdh576h5f7h568h65bh5bfh76h5bdh667h96h95q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvb88,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6666859d7c-9bwlq_openstack(d9de2aa3-83f2-4701-b09e-00d0fab8403f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:36:45 crc kubenswrapper[4760]: E1204 
12:36:45.655760 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6666859d7c-9bwlq" podUID="d9de2aa3-83f2-4701-b09e-00d0fab8403f" Dec 04 12:36:46 crc kubenswrapper[4760]: E1204 12:36:46.098783 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 04 12:36:46 crc kubenswrapper[4760]: E1204 12:36:46.099714 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cbh5fch5b6h96h649h5d6h665h8h5bdh544h56bh59h5c6h64ch65fh7ch559h648h5f9h589hf6h565h568h58ch6bh658h89h57dh59dh5f5h5cbh669q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v4gtd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(23335f60-d3db-4308-b1fe-a4603a8d65e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:36:46 crc kubenswrapper[4760]: E1204 12:36:46.109490 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 04 12:36:46 crc kubenswrapper[4760]: E1204 12:36:46.109726 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8ch65ch5d6hd8h57dhc6h5chf4hc8h9h64bh5fbh567h5fbh596h5dhf8hfbh698h87h546h678h649h5d5h87h5c8h66fhcfh65dh7dh58fh84q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2khq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5b7fc6c944-sh7tv_openstack(a6452e5d-5eb7-4d21-96ea-eefbc327f2f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:36:46 crc kubenswrapper[4760]: E1204 
12:36:46.120509 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 04 12:36:46 crc kubenswrapper[4760]: E1204 12:36:46.120709 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" Dec 04 12:36:46 crc kubenswrapper[4760]: E1204 12:36:46.121041 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n544h8ch54fh5dbhcfh65bh55bh5c8h547hfbh76h6fh55fh557h578h676h5c6h5dh659h657hbdh588h75h67fh587h84hdbh598h679h569h67fh9fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8gbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-66f8fb5648-87dff_openstack(a1b21d48-8d8c-4c52-8be9-7a188fffa3cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:36:46 crc kubenswrapper[4760]: E1204 
12:36:46.124391 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.219396 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.224888 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jl9zh" event={"ID":"40fefe31-76d7-458b-b4ef-fb49320cbb18","Type":"ContainerDied","Data":"b592ab8c7b05ec27c95fe65f39b780d0dc8ca5b9b50032a77bf61624fe002d73"} Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.224989 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b592ab8c7b05ec27c95fe65f39b780d0dc8ca5b9b50032a77bf61624fe002d73" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.228871 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84489dfbd7-57pm6" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.229167 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84489dfbd7-57pm6" event={"ID":"a1e1a276-940e-45e7-b6b3-f9650cbd653c","Type":"ContainerDied","Data":"6ac727b71cc3440df93474f32003fc5abcb801be5d8d86c23e3855c69b7c57ce"} Dec 04 12:36:46 crc kubenswrapper[4760]: E1204 12:36:46.235277 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" Dec 04 12:36:46 crc kubenswrapper[4760]: E1204 12:36:46.235921 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.248550 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.405425 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1e1a276-940e-45e7-b6b3-f9650cbd653c-logs\") pod \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.405580 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkztl\" (UniqueName: \"kubernetes.io/projected/40fefe31-76d7-458b-b4ef-fb49320cbb18-kube-api-access-mkztl\") pod \"40fefe31-76d7-458b-b4ef-fb49320cbb18\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.405718 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1e1a276-940e-45e7-b6b3-f9650cbd653c-horizon-secret-key\") pod \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.405813 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqgfn\" (UniqueName: \"kubernetes.io/projected/a1e1a276-940e-45e7-b6b3-f9650cbd653c-kube-api-access-fqgfn\") pod \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.405927 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-scripts\") pod \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.405964 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-combined-ca-bundle\") pod \"40fefe31-76d7-458b-b4ef-fb49320cbb18\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.407422 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-config\") pod \"40fefe31-76d7-458b-b4ef-fb49320cbb18\" (UID: \"40fefe31-76d7-458b-b4ef-fb49320cbb18\") " Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.407502 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-config-data\") pod \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\" (UID: \"a1e1a276-940e-45e7-b6b3-f9650cbd653c\") " Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.408773 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e1a276-940e-45e7-b6b3-f9650cbd653c-logs" (OuterVolumeSpecName: "logs") pod "a1e1a276-940e-45e7-b6b3-f9650cbd653c" (UID: "a1e1a276-940e-45e7-b6b3-f9650cbd653c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.409940 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-scripts" (OuterVolumeSpecName: "scripts") pod "a1e1a276-940e-45e7-b6b3-f9650cbd653c" (UID: "a1e1a276-940e-45e7-b6b3-f9650cbd653c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.412068 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-config-data" (OuterVolumeSpecName: "config-data") pod "a1e1a276-940e-45e7-b6b3-f9650cbd653c" (UID: "a1e1a276-940e-45e7-b6b3-f9650cbd653c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.415048 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40fefe31-76d7-458b-b4ef-fb49320cbb18-kube-api-access-mkztl" (OuterVolumeSpecName: "kube-api-access-mkztl") pod "40fefe31-76d7-458b-b4ef-fb49320cbb18" (UID: "40fefe31-76d7-458b-b4ef-fb49320cbb18"). InnerVolumeSpecName "kube-api-access-mkztl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.434184 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e1a276-940e-45e7-b6b3-f9650cbd653c-kube-api-access-fqgfn" (OuterVolumeSpecName: "kube-api-access-fqgfn") pod "a1e1a276-940e-45e7-b6b3-f9650cbd653c" (UID: "a1e1a276-940e-45e7-b6b3-f9650cbd653c"). InnerVolumeSpecName "kube-api-access-fqgfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.437409 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e1a276-940e-45e7-b6b3-f9650cbd653c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a1e1a276-940e-45e7-b6b3-f9650cbd653c" (UID: "a1e1a276-940e-45e7-b6b3-f9650cbd653c"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.438185 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40fefe31-76d7-458b-b4ef-fb49320cbb18" (UID: "40fefe31-76d7-458b-b4ef-fb49320cbb18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.438483 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-config" (OuterVolumeSpecName: "config") pod "40fefe31-76d7-458b-b4ef-fb49320cbb18" (UID: "40fefe31-76d7-458b-b4ef-fb49320cbb18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.511639 4760 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1e1a276-940e-45e7-b6b3-f9650cbd653c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.512126 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqgfn\" (UniqueName: \"kubernetes.io/projected/a1e1a276-940e-45e7-b6b3-f9650cbd653c-kube-api-access-fqgfn\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.512177 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.512195 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 
04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.512239 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/40fefe31-76d7-458b-b4ef-fb49320cbb18-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.512255 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1e1a276-940e-45e7-b6b3-f9650cbd653c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.512267 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1e1a276-940e-45e7-b6b3-f9650cbd653c-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.512280 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkztl\" (UniqueName: \"kubernetes.io/projected/40fefe31-76d7-458b-b4ef-fb49320cbb18-kube-api-access-mkztl\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.635109 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84489dfbd7-57pm6"] Dec 04 12:36:46 crc kubenswrapper[4760]: I1204 12:36:46.656294 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84489dfbd7-57pm6"] Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.236943 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jl9zh" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.630185 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-vt25n"] Dec 04 12:36:47 crc kubenswrapper[4760]: E1204 12:36:47.633015 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40fefe31-76d7-458b-b4ef-fb49320cbb18" containerName="neutron-db-sync" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.633053 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="40fefe31-76d7-458b-b4ef-fb49320cbb18" containerName="neutron-db-sync" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.633349 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="40fefe31-76d7-458b-b4ef-fb49320cbb18" containerName="neutron-db-sync" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.636923 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.679172 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-vt25n"] Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.747094 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.747354 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" 
Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.747388 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcd5b\" (UniqueName: \"kubernetes.io/projected/d085487a-6288-4d8c-87b0-11b7924141f4-kube-api-access-xcd5b\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.747550 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-config\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.747617 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-svc\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.747662 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.779573 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f9db4bf7b-jdfxz"] Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.784312 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.792394 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.792773 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.792825 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.793142 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-q74n6" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.804754 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f9db4bf7b-jdfxz"] Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.850490 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.850750 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.850787 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcd5b\" (UniqueName: \"kubernetes.io/projected/d085487a-6288-4d8c-87b0-11b7924141f4-kube-api-access-xcd5b\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: 
\"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.850905 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-config\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.850956 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-svc\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.850994 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.852147 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.853632 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 
12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.853683 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-config\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.853952 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.854040 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-svc\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.880100 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcd5b\" (UniqueName: \"kubernetes.io/projected/d085487a-6288-4d8c-87b0-11b7924141f4-kube-api-access-xcd5b\") pod \"dnsmasq-dns-55f844cf75-vt25n\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.887413 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1e1a276-940e-45e7-b6b3-f9650cbd653c" path="/var/lib/kubelet/pods/a1e1a276-940e-45e7-b6b3-f9650cbd653c/volumes" Dec 04 12:36:47 crc kubenswrapper[4760]: E1204 12:36:47.951181 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 04 12:36:47 crc kubenswrapper[4760]: E1204 12:36:47.952047 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z8gh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Live
nessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-46wx2_openstack(a8179a26-2281-4a5d-bc77-808a2f7e61bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:36:47 crc kubenswrapper[4760]: E1204 12:36:47.953240 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-46wx2" podUID="a8179a26-2281-4a5d-bc77-808a2f7e61bb" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.953605 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-combined-ca-bundle\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.953660 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-ovndb-tls-certs\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.953836 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-config\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.955154 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxf2g\" (UniqueName: \"kubernetes.io/projected/94d936ad-6af2-4f94-8d9e-0111032b5cad-kube-api-access-bxf2g\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.955457 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-httpd-config\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:47 crc kubenswrapper[4760]: I1204 12:36:47.970137 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:48 crc kubenswrapper[4760]: I1204 12:36:48.058222 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxf2g\" (UniqueName: \"kubernetes.io/projected/94d936ad-6af2-4f94-8d9e-0111032b5cad-kube-api-access-bxf2g\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: I1204 12:36:48.058335 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-httpd-config\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: I1204 12:36:48.058392 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-combined-ca-bundle\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: I1204 12:36:48.058459 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-ovndb-tls-certs\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: I1204 12:36:48.059464 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-config\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: 
I1204 12:36:48.064237 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-httpd-config\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: I1204 12:36:48.064309 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-combined-ca-bundle\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: I1204 12:36:48.066121 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-config\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: I1204 12:36:48.069659 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-ovndb-tls-certs\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: I1204 12:36:48.089834 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxf2g\" (UniqueName: \"kubernetes.io/projected/94d936ad-6af2-4f94-8d9e-0111032b5cad-kube-api-access-bxf2g\") pod \"neutron-f9db4bf7b-jdfxz\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: I1204 12:36:48.144045 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:48 crc kubenswrapper[4760]: E1204 12:36:48.249699 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-46wx2" podUID="a8179a26-2281-4a5d-bc77-808a2f7e61bb" Dec 04 12:36:48 crc kubenswrapper[4760]: E1204 12:36:48.669196 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-manila-api:current-podified" Dec 04 12:36:48 crc kubenswrapper[4760]: E1204 12:36:48.669560 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manila-db-sync,Image:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,Command:[/bin/bash],Args:[-c sleep 0 && /usr/bin/manila-manage --config-dir /etc/manila/manila.conf.d db 
sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:job-config-data,ReadOnly:true,MountPath:/etc/manila/manila.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fpr4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42429,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42429,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-db-sync-rb22w_openstack(6e2d78cb-0c7a-408f-a736-6630b41bd80b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:36:48 crc kubenswrapper[4760]: E1204 12:36:48.670871 4760 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/manila-db-sync-rb22w" podUID="6e2d78cb-0c7a-408f-a736-6630b41bd80b" Dec 04 12:36:49 crc kubenswrapper[4760]: E1204 12:36:49.260670 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-manila-api:current-podified\\\"\"" pod="openstack/manila-db-sync-rb22w" podUID="6e2d78cb-0c7a-408f-a736-6630b41bd80b" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.260394 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-869c8d7d5c-srd5v"] Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.267015 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.270743 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.272366 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.275733 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-869c8d7d5c-srd5v"] Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.319796 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kspwn\" (UniqueName: \"kubernetes.io/projected/c05ef76a-b809-4d3d-972c-2e5d2037b806-kube-api-access-kspwn\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.319957 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-internal-tls-certs\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.320020 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-httpd-config\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.320283 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-ovndb-tls-certs\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.320439 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-combined-ca-bundle\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.320576 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-config\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.320665 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-public-tls-certs\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.423504 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-ovndb-tls-certs\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.423594 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-combined-ca-bundle\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.423645 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-config\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.423680 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-public-tls-certs\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.423774 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kspwn\" (UniqueName: 
\"kubernetes.io/projected/c05ef76a-b809-4d3d-972c-2e5d2037b806-kube-api-access-kspwn\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.423824 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-internal-tls-certs\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.423850 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-httpd-config\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.433046 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-public-tls-certs\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.433339 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-httpd-config\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.433151 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-combined-ca-bundle\") pod \"neutron-869c8d7d5c-srd5v\" 
(UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.433652 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-ovndb-tls-certs\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.435256 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-config\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.436864 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c05ef76a-b809-4d3d-972c-2e5d2037b806-internal-tls-certs\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.451519 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kspwn\" (UniqueName: \"kubernetes.io/projected/c05ef76a-b809-4d3d-972c-2e5d2037b806-kube-api-access-kspwn\") pod \"neutron-869c8d7d5c-srd5v\" (UID: \"c05ef76a-b809-4d3d-972c-2e5d2037b806\") " pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:50 crc kubenswrapper[4760]: I1204 12:36:50.597692 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:52 crc kubenswrapper[4760]: E1204 12:36:52.566786 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.567424 4760 scope.go:117] "RemoveContainer" containerID="ad28aaa019017eb682b8f12a7dd1c734a8e583ad48922b74784c97de851d0a29" Dec 04 12:36:52 crc kubenswrapper[4760]: E1204 12:36:52.567450 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mc8gw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*4240
3,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-rqwnf_openstack(640263be-b424-4ed1-b0f5-d4b9907113e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:36:52 crc kubenswrapper[4760]: E1204 12:36:52.568528 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-rqwnf" podUID="640263be-b424-4ed1-b0f5-d4b9907113e2" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.775502 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.824944 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.894460 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6flrv\" (UniqueName: \"kubernetes.io/projected/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-kube-api-access-6flrv\") pod \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.894571 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-config-data\") pod \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.894650 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-config-data\") pod \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.894805 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvb88\" (UniqueName: \"kubernetes.io/projected/d9de2aa3-83f2-4701-b09e-00d0fab8403f-kube-api-access-qvb88\") pod \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.894833 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9de2aa3-83f2-4701-b09e-00d0fab8403f-horizon-secret-key\") pod \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.894961 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-scripts\") pod \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.894999 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-logs\") pod \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.895077 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-scripts\") pod \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.895137 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9de2aa3-83f2-4701-b09e-00d0fab8403f-logs\") pod \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\" (UID: \"d9de2aa3-83f2-4701-b09e-00d0fab8403f\") " Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.895358 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-horizon-secret-key\") pod \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\" (UID: \"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f\") " Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.895716 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-scripts" (OuterVolumeSpecName: "scripts") pod "eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f" (UID: "eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.896004 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.896186 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-config-data" (OuterVolumeSpecName: "config-data") pod "d9de2aa3-83f2-4701-b09e-00d0fab8403f" (UID: "d9de2aa3-83f2-4701-b09e-00d0fab8403f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.898021 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-logs" (OuterVolumeSpecName: "logs") pod "eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f" (UID: "eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.898578 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-config-data" (OuterVolumeSpecName: "config-data") pod "eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f" (UID: "eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.901172 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9de2aa3-83f2-4701-b09e-00d0fab8403f-logs" (OuterVolumeSpecName: "logs") pod "d9de2aa3-83f2-4701-b09e-00d0fab8403f" (UID: "d9de2aa3-83f2-4701-b09e-00d0fab8403f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.901431 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-scripts" (OuterVolumeSpecName: "scripts") pod "d9de2aa3-83f2-4701-b09e-00d0fab8403f" (UID: "d9de2aa3-83f2-4701-b09e-00d0fab8403f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.907341 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f" (UID: "eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.907504 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-kube-api-access-6flrv" (OuterVolumeSpecName: "kube-api-access-6flrv") pod "eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f" (UID: "eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f"). InnerVolumeSpecName "kube-api-access-6flrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.921525 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9de2aa3-83f2-4701-b09e-00d0fab8403f-kube-api-access-qvb88" (OuterVolumeSpecName: "kube-api-access-qvb88") pod "d9de2aa3-83f2-4701-b09e-00d0fab8403f" (UID: "d9de2aa3-83f2-4701-b09e-00d0fab8403f"). InnerVolumeSpecName "kube-api-access-qvb88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.921571 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9de2aa3-83f2-4701-b09e-00d0fab8403f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d9de2aa3-83f2-4701-b09e-00d0fab8403f" (UID: "d9de2aa3-83f2-4701-b09e-00d0fab8403f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.998718 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.998758 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvb88\" (UniqueName: \"kubernetes.io/projected/d9de2aa3-83f2-4701-b09e-00d0fab8403f-kube-api-access-qvb88\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.998776 4760 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9de2aa3-83f2-4701-b09e-00d0fab8403f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.998789 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.998836 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.998849 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d9de2aa3-83f2-4701-b09e-00d0fab8403f-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.998860 4760 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.998889 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6flrv\" (UniqueName: \"kubernetes.io/projected/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f-kube-api-access-6flrv\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:52 crc kubenswrapper[4760]: I1204 12:36:52.998933 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9de2aa3-83f2-4701-b09e-00d0fab8403f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.269807 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.288166 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8n9vw"] Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.305298 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6666859d7c-9bwlq" event={"ID":"d9de2aa3-83f2-4701-b09e-00d0fab8403f","Type":"ContainerDied","Data":"5e691e460f195b813b38c37ef98230d32f26d6b7759bf56a6ab0aab6f3067e70"} Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.305338 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6666859d7c-9bwlq" Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.307580 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b5586b4cc-jx6kr" event={"ID":"eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f","Type":"ContainerDied","Data":"405048ade6f366e417999cd9cd10c78a9dae5bd1f871bce4b4bc291a40815d13"} Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.307651 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b5586b4cc-jx6kr" Dec 04 12:36:53 crc kubenswrapper[4760]: E1204 12:36:53.312526 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-rqwnf" podUID="640263be-b424-4ed1-b0f5-d4b9907113e2" Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.417194 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b5586b4cc-jx6kr"] Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.427567 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5b5586b4cc-jx6kr"] Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.461383 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6666859d7c-9bwlq"] Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.474300 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:36:53 crc kubenswrapper[4760]: W1204 12:36:53.484754 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podead32194_7c87_4c05_99b6_55a928499e0d.slice/crio-5c15c9d697119aa5f59cd5f28946c8ab15964ac1d312186c94f6f322fbe1c040 WatchSource:0}: Error finding container 
5c15c9d697119aa5f59cd5f28946c8ab15964ac1d312186c94f6f322fbe1c040: Status 404 returned error can't find the container with id 5c15c9d697119aa5f59cd5f28946c8ab15964ac1d312186c94f6f322fbe1c040 Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.487189 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6666859d7c-9bwlq"] Dec 04 12:36:53 crc kubenswrapper[4760]: W1204 12:36:53.543025 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16fa8aaa_4421_48a8_8b79_9d0780e6d04a.slice/crio-bb7359e8379504ff26120a6ee0aa3ece38739f873b67a9bd16a3b9f3176c8ac8 WatchSource:0}: Error finding container bb7359e8379504ff26120a6ee0aa3ece38739f873b67a9bd16a3b9f3176c8ac8: Status 404 returned error can't find the container with id bb7359e8379504ff26120a6ee0aa3ece38739f873b67a9bd16a3b9f3176c8ac8 Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.574583 4760 scope.go:117] "RemoveContainer" containerID="ed29677e02dfd161e1309c1fb649940d5c5587818c866ff6c21352261d77e5e7" Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.859602 4760 scope.go:117] "RemoveContainer" containerID="127ca536165f2ec8ae2b3fdc19bd5f97d83682253f188aefab8f3815e316ce64" Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.906590 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9de2aa3-83f2-4701-b09e-00d0fab8403f" path="/var/lib/kubelet/pods/d9de2aa3-83f2-4701-b09e-00d0fab8403f/volumes" Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.907633 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f" path="/var/lib/kubelet/pods/eb14ef2a-1124-4b0f-9e0e-4772ae2b1e7f/volumes" Dec 04 12:36:53 crc kubenswrapper[4760]: I1204 12:36:53.938739 4760 scope.go:117] "RemoveContainer" containerID="fe3f7c1ab296fa50fb9b5fa7d663e62a408ed4f196559998ec7915953588d7b2" Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.032643 4760 
scope.go:117] "RemoveContainer" containerID="c2eea84c49bbcd70070833e34e4eb146a61c5a55c94da1f30c54c7e7d322bf3c" Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.229400 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-vt25n"] Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.289767 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f9db4bf7b-jdfxz"] Dec 04 12:36:54 crc kubenswrapper[4760]: W1204 12:36:54.329409 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94d936ad_6af2_4f94_8d9e_0111032b5cad.slice/crio-9280322612f3dd921e141c41af730f21eafa9c0d36ddc99ec66c18f8e6a900cb WatchSource:0}: Error finding container 9280322612f3dd921e141c41af730f21eafa9c0d36ddc99ec66c18f8e6a900cb: Status 404 returned error can't find the container with id 9280322612f3dd921e141c41af730f21eafa9c0d36ddc99ec66c18f8e6a900cb Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.377397 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8n9vw" event={"ID":"30fe1729-73ac-43f6-bc34-b335da33a7e6","Type":"ContainerStarted","Data":"102b444d9290b954810a48ad0f3ee29b6f9730f44a5ede18f183bb8f908f06c3"} Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.377573 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8n9vw" event={"ID":"30fe1729-73ac-43f6-bc34-b335da33a7e6","Type":"ContainerStarted","Data":"de5cdf02392972e5a562642e42c4c0282ba6d72c0a00c6335fa9781f1451ac01"} Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.386461 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ead32194-7c87-4c05-99b6-55a928499e0d","Type":"ContainerStarted","Data":"5c15c9d697119aa5f59cd5f28946c8ab15964ac1d312186c94f6f322fbe1c040"} Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.408126 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nmzx2" event={"ID":"fe6ec201-ccbb-4003-9893-13b6656a1624","Type":"ContainerStarted","Data":"5c5493bd19b539ac81e35141045a228554df10121d25d9035c912ffed092266f"} Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.418786 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8n9vw" podStartSLOduration=30.418749327 podStartE2EDuration="30.418749327s" podCreationTimestamp="2025-12-04 12:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:54.409243835 +0000 UTC m=+1417.450690432" watchObservedRunningTime="2025-12-04 12:36:54.418749327 +0000 UTC m=+1417.460195894" Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.437633 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16fa8aaa-4421-48a8-8b79-9d0780e6d04a","Type":"ContainerStarted","Data":"bb7359e8379504ff26120a6ee0aa3ece38739f873b67a9bd16a3b9f3176c8ac8"} Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.451838 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23335f60-d3db-4308-b1fe-a4603a8d65e7","Type":"ContainerStarted","Data":"5e2f33fe8fe3223fef6bc089e77f6416ffe37827017225a0e0eb344ea979a811"} Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.454081 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" event={"ID":"d085487a-6288-4d8c-87b0-11b7924141f4","Type":"ContainerStarted","Data":"96b9ed8efc3f0f69ebf57bb6fe629fb748af00059600c674abe5f071d2d02975"} Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.465715 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-869c8d7d5c-srd5v"] Dec 04 12:36:54 crc kubenswrapper[4760]: I1204 12:36:54.480851 4760 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/placement-db-sync-nmzx2" podStartSLOduration=6.915157828 podStartE2EDuration="53.480797547s" podCreationTimestamp="2025-12-04 12:36:01 +0000 UTC" firstStartedPulling="2025-12-04 12:36:05.971616104 +0000 UTC m=+1369.013062671" lastFinishedPulling="2025-12-04 12:36:52.537255823 +0000 UTC m=+1415.578702390" observedRunningTime="2025-12-04 12:36:54.43332755 +0000 UTC m=+1417.474774127" watchObservedRunningTime="2025-12-04 12:36:54.480797547 +0000 UTC m=+1417.522244114" Dec 04 12:36:55 crc kubenswrapper[4760]: I1204 12:36:55.493122 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16fa8aaa-4421-48a8-8b79-9d0780e6d04a","Type":"ContainerStarted","Data":"397a43da3613c2b526c626e5e6731e03259fc8d31bd1daac41d3f811bb234537"} Dec 04 12:36:55 crc kubenswrapper[4760]: I1204 12:36:55.496975 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ead32194-7c87-4c05-99b6-55a928499e0d","Type":"ContainerStarted","Data":"687679c5e5a0e7a203ada44ea9f444017276f4c4fbac1d624bf04a251f7494e0"} Dec 04 12:36:55 crc kubenswrapper[4760]: I1204 12:36:55.501076 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f9db4bf7b-jdfxz" event={"ID":"94d936ad-6af2-4f94-8d9e-0111032b5cad","Type":"ContainerStarted","Data":"cc4bb714b5e43dd8d68e6969cad10449ea5df26f89281f0554ab6248963395e2"} Dec 04 12:36:55 crc kubenswrapper[4760]: I1204 12:36:55.501130 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f9db4bf7b-jdfxz" event={"ID":"94d936ad-6af2-4f94-8d9e-0111032b5cad","Type":"ContainerStarted","Data":"9280322612f3dd921e141c41af730f21eafa9c0d36ddc99ec66c18f8e6a900cb"} Dec 04 12:36:55 crc kubenswrapper[4760]: I1204 12:36:55.522047 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869c8d7d5c-srd5v" 
event={"ID":"c05ef76a-b809-4d3d-972c-2e5d2037b806","Type":"ContainerStarted","Data":"89a3ca02c18b92af61e80846659b22e0324ceedaedf71d562ce9b479f61a72fb"} Dec 04 12:36:55 crc kubenswrapper[4760]: I1204 12:36:55.522115 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869c8d7d5c-srd5v" event={"ID":"c05ef76a-b809-4d3d-972c-2e5d2037b806","Type":"ContainerStarted","Data":"e5e90606324b1609f480236b56033a1885fed7e6e75b6a35ea018889aecadd76"} Dec 04 12:36:55 crc kubenswrapper[4760]: I1204 12:36:55.532765 4760 generic.go:334] "Generic (PLEG): container finished" podID="d085487a-6288-4d8c-87b0-11b7924141f4" containerID="b60f07b58036e539ad0748a38a481f6bd0cc9b488bffc6df45e4dd38d0e2a807" exitCode=0 Dec 04 12:36:55 crc kubenswrapper[4760]: I1204 12:36:55.534331 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" event={"ID":"d085487a-6288-4d8c-87b0-11b7924141f4","Type":"ContainerDied","Data":"b60f07b58036e539ad0748a38a481f6bd0cc9b488bffc6df45e4dd38d0e2a807"} Dec 04 12:36:56 crc kubenswrapper[4760]: I1204 12:36:56.570322 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869c8d7d5c-srd5v" event={"ID":"c05ef76a-b809-4d3d-972c-2e5d2037b806","Type":"ContainerStarted","Data":"64e2ba6ac5190e82905cbc5d6653297bfe9933d73f332114654d799936367221"} Dec 04 12:36:56 crc kubenswrapper[4760]: I1204 12:36:56.570832 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:36:56 crc kubenswrapper[4760]: I1204 12:36:56.580783 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" event={"ID":"d085487a-6288-4d8c-87b0-11b7924141f4","Type":"ContainerStarted","Data":"0c104678c9e693c103e49cb8df74e189aefcc30b8ceeccc2ff4d0876a9b439d4"} Dec 04 12:36:56 crc kubenswrapper[4760]: I1204 12:36:56.584527 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:36:56 crc kubenswrapper[4760]: I1204 12:36:56.589419 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ead32194-7c87-4c05-99b6-55a928499e0d","Type":"ContainerStarted","Data":"b4890983a20fcaf634982b513116b99dbc34aa74525b3efaf48e5a5d4349a66b"} Dec 04 12:36:56 crc kubenswrapper[4760]: I1204 12:36:56.600889 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f9db4bf7b-jdfxz" event={"ID":"94d936ad-6af2-4f94-8d9e-0111032b5cad","Type":"ContainerStarted","Data":"7d02f4faa6170b88d08cadd8e1179e5e036635971e3b0544bf1f05777d23a70c"} Dec 04 12:36:56 crc kubenswrapper[4760]: I1204 12:36:56.604316 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:36:56 crc kubenswrapper[4760]: I1204 12:36:56.622691 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-869c8d7d5c-srd5v" podStartSLOduration=6.622659506 podStartE2EDuration="6.622659506s" podCreationTimestamp="2025-12-04 12:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:56.622050477 +0000 UTC m=+1419.663497054" watchObservedRunningTime="2025-12-04 12:36:56.622659506 +0000 UTC m=+1419.664106073" Dec 04 12:36:56 crc kubenswrapper[4760]: I1204 12:36:56.667370 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=33.667288793 podStartE2EDuration="33.667288793s" podCreationTimestamp="2025-12-04 12:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:56.658152063 +0000 UTC m=+1419.699598750" watchObservedRunningTime="2025-12-04 12:36:56.667288793 +0000 UTC m=+1419.708735360" Dec 04 12:36:56 
crc kubenswrapper[4760]: I1204 12:36:56.693330 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" podStartSLOduration=9.693299128 podStartE2EDuration="9.693299128s" podCreationTimestamp="2025-12-04 12:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:56.686822293 +0000 UTC m=+1419.728268860" watchObservedRunningTime="2025-12-04 12:36:56.693299128 +0000 UTC m=+1419.734745695" Dec 04 12:36:56 crc kubenswrapper[4760]: I1204 12:36:56.733786 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f9db4bf7b-jdfxz" podStartSLOduration=9.733751952 podStartE2EDuration="9.733751952s" podCreationTimestamp="2025-12-04 12:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:56.714978157 +0000 UTC m=+1419.756424724" watchObservedRunningTime="2025-12-04 12:36:56.733751952 +0000 UTC m=+1419.775198519" Dec 04 12:36:57 crc kubenswrapper[4760]: I1204 12:36:57.630942 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16fa8aaa-4421-48a8-8b79-9d0780e6d04a","Type":"ContainerStarted","Data":"0633f872cc5171c179e3df911714690342d38a84f061229b5c5387033035ac80"} Dec 04 12:36:57 crc kubenswrapper[4760]: I1204 12:36:57.675915 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=34.67588015 podStartE2EDuration="34.67588015s" podCreationTimestamp="2025-12-04 12:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:36:57.654529222 +0000 UTC m=+1420.695975789" watchObservedRunningTime="2025-12-04 12:36:57.67588015 +0000 UTC 
m=+1420.717326717" Dec 04 12:37:02 crc kubenswrapper[4760]: I1204 12:37:02.971403 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:37:03 crc kubenswrapper[4760]: I1204 12:37:03.064906 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-d5x4k"] Dec 04 12:37:03 crc kubenswrapper[4760]: I1204 12:37:03.065590 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" podUID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" containerName="dnsmasq-dns" containerID="cri-o://5db3f6150ba5d584cfeef7ded8abacea57ed7ed3720f0750a2fdd96735285378" gracePeriod=10 Dec 04 12:37:03 crc kubenswrapper[4760]: I1204 12:37:03.380376 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:37:03 crc kubenswrapper[4760]: I1204 12:37:03.380511 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:37:03 crc kubenswrapper[4760]: I1204 12:37:03.380573 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:37:03 crc kubenswrapper[4760]: I1204 12:37:03.381694 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47735eb331db95a9c8463c133a692889dc631bd67fa11179c4ea953bd5406acf"} 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 12:37:03 crc kubenswrapper[4760]: I1204 12:37:03.381777 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://47735eb331db95a9c8463c133a692889dc631bd67fa11179c4ea953bd5406acf" gracePeriod=600 Dec 04 12:37:04 crc kubenswrapper[4760]: I1204 12:37:04.242755 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" podUID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Dec 04 12:37:04 crc kubenswrapper[4760]: I1204 12:37:04.463240 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 12:37:04 crc kubenswrapper[4760]: I1204 12:37:04.463309 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 12:37:04 crc kubenswrapper[4760]: I1204 12:37:04.513525 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 12:37:04 crc kubenswrapper[4760]: I1204 12:37:04.513603 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 12:37:05 crc kubenswrapper[4760]: I1204 12:37:05.076507 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 12:37:05 crc kubenswrapper[4760]: I1204 12:37:05.077558 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 12:37:05 crc 
kubenswrapper[4760]: I1204 12:37:05.079676 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 12:37:05 crc kubenswrapper[4760]: I1204 12:37:05.091441 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 12:37:05 crc kubenswrapper[4760]: I1204 12:37:05.091930 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 12:37:05 crc kubenswrapper[4760]: I1204 12:37:05.100177 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 12:37:05 crc kubenswrapper[4760]: I1204 12:37:05.731261 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 12:37:05 crc kubenswrapper[4760]: I1204 12:37:05.731334 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 12:37:06 crc kubenswrapper[4760]: I1204 12:37:06.750071 4760 generic.go:334] "Generic (PLEG): container finished" podID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" containerID="5db3f6150ba5d584cfeef7ded8abacea57ed7ed3720f0750a2fdd96735285378" exitCode=0 Dec 04 12:37:06 crc kubenswrapper[4760]: I1204 12:37:06.750491 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" event={"ID":"dcf8aac0-0b5c-4170-b849-374f1f4fd65c","Type":"ContainerDied","Data":"5db3f6150ba5d584cfeef7ded8abacea57ed7ed3720f0750a2fdd96735285378"} Dec 04 12:37:06 crc kubenswrapper[4760]: I1204 12:37:06.752631 4760 generic.go:334] "Generic (PLEG): container finished" podID="30fe1729-73ac-43f6-bc34-b335da33a7e6" containerID="102b444d9290b954810a48ad0f3ee29b6f9730f44a5ede18f183bb8f908f06c3" exitCode=0 Dec 04 12:37:06 crc kubenswrapper[4760]: I1204 12:37:06.752657 4760 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-bootstrap-8n9vw" event={"ID":"30fe1729-73ac-43f6-bc34-b335da33a7e6","Type":"ContainerDied","Data":"102b444d9290b954810a48ad0f3ee29b6f9730f44a5ede18f183bb8f908f06c3"} Dec 04 12:37:06 crc kubenswrapper[4760]: I1204 12:37:06.754960 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="47735eb331db95a9c8463c133a692889dc631bd67fa11179c4ea953bd5406acf" exitCode=0 Dec 04 12:37:06 crc kubenswrapper[4760]: I1204 12:37:06.755053 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"47735eb331db95a9c8463c133a692889dc631bd67fa11179c4ea953bd5406acf"} Dec 04 12:37:06 crc kubenswrapper[4760]: I1204 12:37:06.755151 4760 scope.go:117] "RemoveContainer" containerID="bbb8ff1383b54b37d35a08dd354725d1bf3d8a55864345be2ff083742830474e" Dec 04 12:37:06 crc kubenswrapper[4760]: I1204 12:37:06.755085 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:37:06 crc kubenswrapper[4760]: I1204 12:37:06.755290 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:37:09 crc kubenswrapper[4760]: I1204 12:37:09.037179 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 12:37:09 crc kubenswrapper[4760]: I1204 12:37:09.038180 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:37:09 crc kubenswrapper[4760]: I1204 12:37:09.040774 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 12:37:09 crc kubenswrapper[4760]: I1204 12:37:09.060222 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 12:37:09 crc kubenswrapper[4760]: I1204 
12:37:09.060367 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:37:09 crc kubenswrapper[4760]: I1204 12:37:09.078400 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 12:37:09 crc kubenswrapper[4760]: I1204 12:37:09.247336 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" podUID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Dec 04 12:37:10 crc kubenswrapper[4760]: E1204 12:37:10.451460 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Dec 04 12:37:10 crc kubenswrapper[4760]: E1204 12:37:10.451946 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v4gtd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(23335f60-d3db-4308-b1fe-a4603a8d65e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.545195 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.653571 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.732948 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-fernet-keys\") pod \"30fe1729-73ac-43f6-bc34-b335da33a7e6\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.733054 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98h8q\" (UniqueName: \"kubernetes.io/projected/30fe1729-73ac-43f6-bc34-b335da33a7e6-kube-api-access-98h8q\") pod \"30fe1729-73ac-43f6-bc34-b335da33a7e6\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.733080 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-credential-keys\") pod \"30fe1729-73ac-43f6-bc34-b335da33a7e6\" (UID: 
\"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.733222 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-config-data\") pod \"30fe1729-73ac-43f6-bc34-b335da33a7e6\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.733280 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-scripts\") pod \"30fe1729-73ac-43f6-bc34-b335da33a7e6\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.733528 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-combined-ca-bundle\") pod \"30fe1729-73ac-43f6-bc34-b335da33a7e6\" (UID: \"30fe1729-73ac-43f6-bc34-b335da33a7e6\") " Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.746821 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30fe1729-73ac-43f6-bc34-b335da33a7e6-kube-api-access-98h8q" (OuterVolumeSpecName: "kube-api-access-98h8q") pod "30fe1729-73ac-43f6-bc34-b335da33a7e6" (UID: "30fe1729-73ac-43f6-bc34-b335da33a7e6"). InnerVolumeSpecName "kube-api-access-98h8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.747298 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-scripts" (OuterVolumeSpecName: "scripts") pod "30fe1729-73ac-43f6-bc34-b335da33a7e6" (UID: "30fe1729-73ac-43f6-bc34-b335da33a7e6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.789861 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "30fe1729-73ac-43f6-bc34-b335da33a7e6" (UID: "30fe1729-73ac-43f6-bc34-b335da33a7e6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.790998 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "30fe1729-73ac-43f6-bc34-b335da33a7e6" (UID: "30fe1729-73ac-43f6-bc34-b335da33a7e6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.821887 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-config-data" (OuterVolumeSpecName: "config-data") pod "30fe1729-73ac-43f6-bc34-b335da33a7e6" (UID: "30fe1729-73ac-43f6-bc34-b335da33a7e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.837125 4760 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.837491 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98h8q\" (UniqueName: \"kubernetes.io/projected/30fe1729-73ac-43f6-bc34-b335da33a7e6-kube-api-access-98h8q\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.837507 4760 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.837517 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.837526 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.878581 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30fe1729-73ac-43f6-bc34-b335da33a7e6" (UID: "30fe1729-73ac-43f6-bc34-b335da33a7e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:10 crc kubenswrapper[4760]: I1204 12:37:10.969922 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fe1729-73ac-43f6-bc34-b335da33a7e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.011650 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8n9vw" event={"ID":"30fe1729-73ac-43f6-bc34-b335da33a7e6","Type":"ContainerDied","Data":"de5cdf02392972e5a562642e42c4c0282ba6d72c0a00c6335fa9781f1451ac01"} Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.011709 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de5cdf02392972e5a562642e42c4c0282ba6d72c0a00c6335fa9781f1451ac01" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.011793 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8n9vw" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.151091 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.285118 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-swift-storage-0\") pod \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.285233 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-svc\") pod \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.285380 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-sb\") pod \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.285508 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-config\") pod \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.285619 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh7cl\" (UniqueName: \"kubernetes.io/projected/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-kube-api-access-dh7cl\") pod \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.285686 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-nb\") pod \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.293523 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-kube-api-access-dh7cl" (OuterVolumeSpecName: "kube-api-access-dh7cl") pod "dcf8aac0-0b5c-4170-b849-374f1f4fd65c" (UID: "dcf8aac0-0b5c-4170-b849-374f1f4fd65c"). InnerVolumeSpecName "kube-api-access-dh7cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.369740 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dcf8aac0-0b5c-4170-b849-374f1f4fd65c" (UID: "dcf8aac0-0b5c-4170-b849-374f1f4fd65c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.387482 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh7cl\" (UniqueName: \"kubernetes.io/projected/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-kube-api-access-dh7cl\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.387521 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.414155 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dcf8aac0-0b5c-4170-b849-374f1f4fd65c" (UID: "dcf8aac0-0b5c-4170-b849-374f1f4fd65c"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.434423 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dcf8aac0-0b5c-4170-b849-374f1f4fd65c" (UID: "dcf8aac0-0b5c-4170-b849-374f1f4fd65c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.492822 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-config" (OuterVolumeSpecName: "config") pod "dcf8aac0-0b5c-4170-b849-374f1f4fd65c" (UID: "dcf8aac0-0b5c-4170-b849-374f1f4fd65c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.493306 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-config\") pod \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\" (UID: \"dcf8aac0-0b5c-4170-b849-374f1f4fd65c\") " Dec 04 12:37:11 crc kubenswrapper[4760]: W1204 12:37:11.497577 4760 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/dcf8aac0-0b5c-4170-b849-374f1f4fd65c/volumes/kubernetes.io~configmap/config Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.497623 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-config" (OuterVolumeSpecName: "config") pod "dcf8aac0-0b5c-4170-b849-374f1f4fd65c" (UID: "dcf8aac0-0b5c-4170-b849-374f1f4fd65c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.498655 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.498698 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.498715 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.516834 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dcf8aac0-0b5c-4170-b849-374f1f4fd65c" (UID: "dcf8aac0-0b5c-4170-b849-374f1f4fd65c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.600627 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dcf8aac0-0b5c-4170-b849-374f1f4fd65c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.772981 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d74ff5d57-x29r6"] Dec 04 12:37:11 crc kubenswrapper[4760]: E1204 12:37:11.774436 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" containerName="init" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.774469 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" containerName="init" Dec 04 12:37:11 crc kubenswrapper[4760]: E1204 12:37:11.774517 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" containerName="dnsmasq-dns" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.774533 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" containerName="dnsmasq-dns" Dec 04 12:37:11 crc kubenswrapper[4760]: E1204 12:37:11.774582 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fe1729-73ac-43f6-bc34-b335da33a7e6" containerName="keystone-bootstrap" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.774598 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fe1729-73ac-43f6-bc34-b335da33a7e6" containerName="keystone-bootstrap" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.774970 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="30fe1729-73ac-43f6-bc34-b335da33a7e6" containerName="keystone-bootstrap" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.775016 4760 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" containerName="dnsmasq-dns" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.776540 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.860316 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7867w\" (UniqueName: \"kubernetes.io/projected/da835318-50f2-43af-9988-bad83a5ee42c-kube-api-access-7867w\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.860434 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-scripts\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.860466 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-config-data\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.860508 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-credential-keys\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.860539 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-fernet-keys\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.860589 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-combined-ca-bundle\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.860634 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-internal-tls-certs\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.860669 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-public-tls-certs\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.866234 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.866613 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.866684 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 
12:37:11.866779 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.866684 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnvh9" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.870541 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.914468 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d74ff5d57-x29r6"] Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.964006 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-combined-ca-bundle\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.964100 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-internal-tls-certs\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.964133 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-public-tls-certs\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.964246 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7867w\" (UniqueName: 
\"kubernetes.io/projected/da835318-50f2-43af-9988-bad83a5ee42c-kube-api-access-7867w\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.964307 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-scripts\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.964332 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-config-data\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.964388 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-credential-keys\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.964423 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-fernet-keys\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.977248 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-config-data\") pod \"keystone-d74ff5d57-x29r6\" (UID: 
\"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.980656 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-public-tls-certs\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.981325 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-combined-ca-bundle\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.981482 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-scripts\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.986931 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-fernet-keys\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.990875 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-credential-keys\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:11 crc kubenswrapper[4760]: I1204 12:37:11.991257 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da835318-50f2-43af-9988-bad83a5ee42c-internal-tls-certs\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.008516 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7867w\" (UniqueName: \"kubernetes.io/projected/da835318-50f2-43af-9988-bad83a5ee42c-kube-api-access-7867w\") pod \"keystone-d74ff5d57-x29r6\" (UID: \"da835318-50f2-43af-9988-bad83a5ee42c\") " pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.047584 4760 generic.go:334] "Generic (PLEG): container finished" podID="fe6ec201-ccbb-4003-9893-13b6656a1624" containerID="5c5493bd19b539ac81e35141045a228554df10121d25d9035c912ffed092266f" exitCode=0 Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.047843 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nmzx2" event={"ID":"fe6ec201-ccbb-4003-9893-13b6656a1624","Type":"ContainerDied","Data":"5c5493bd19b539ac81e35141045a228554df10121d25d9035c912ffed092266f"} Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.060162 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a"} Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.091878 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" event={"ID":"dcf8aac0-0b5c-4170-b849-374f1f4fd65c","Type":"ContainerDied","Data":"d3542bc6231416c72f6ff5158c3d9f8d4512e7462265a6ce68e0b0d6cff0edcb"} Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.091966 4760 scope.go:117] 
"RemoveContainer" containerID="5db3f6150ba5d584cfeef7ded8abacea57ed7ed3720f0750a2fdd96735285378" Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.092148 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-d5x4k" Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.157829 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.213001 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-d5x4k"] Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.227379 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-d5x4k"] Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.266550 4760 scope.go:117] "RemoveContainer" containerID="886a3015c4461c6a8f271dcda624ec4e3465b66918e422e487241938df9f2f01" Dec 04 12:37:12 crc kubenswrapper[4760]: I1204 12:37:12.814848 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d74ff5d57-x29r6"] Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.166734 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f8fb5648-87dff" event={"ID":"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc","Type":"ContainerStarted","Data":"80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec"} Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.193066 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d74ff5d57-x29r6" event={"ID":"da835318-50f2-43af-9988-bad83a5ee42c","Type":"ContainerStarted","Data":"927c711b360c332c5d9a20e52c787afcca16c3527a5ae136afdd06113c6a53d9"} Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.256682 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rqwnf" 
event={"ID":"640263be-b424-4ed1-b0f5-d4b9907113e2","Type":"ContainerStarted","Data":"530aa0c9feea88fddde13fdf57142533cbbd84649730112263053ace7b0beff8"} Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.279783 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-46wx2" event={"ID":"a8179a26-2281-4a5d-bc77-808a2f7e61bb","Type":"ContainerStarted","Data":"437b9589ac1c40aef31a422b9ee2eda92a071d0f18efbaf73fcf51148230d821"} Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.294287 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rb22w" event={"ID":"6e2d78cb-0c7a-408f-a736-6630b41bd80b","Type":"ContainerStarted","Data":"5ba953f8ee30f593c2b31c939e252e989f85ce1c744c6b0bda4c4dd9dfefa0c1"} Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.300186 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rqwnf" podStartSLOduration=6.967698396 podStartE2EDuration="1m12.300153032s" podCreationTimestamp="2025-12-04 12:36:01 +0000 UTC" firstStartedPulling="2025-12-04 12:36:06.009122595 +0000 UTC m=+1369.050569172" lastFinishedPulling="2025-12-04 12:37:11.341577241 +0000 UTC m=+1434.383023808" observedRunningTime="2025-12-04 12:37:13.300010048 +0000 UTC m=+1436.341456615" watchObservedRunningTime="2025-12-04 12:37:13.300153032 +0000 UTC m=+1436.341599609" Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.318539 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7fc6c944-sh7tv" event={"ID":"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5","Type":"ContainerStarted","Data":"85667211e1e8e14a683abdb335b198f94d2f3b3e02903230598d78fc653e800b"} Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.377390 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-rb22w" podStartSLOduration=6.637624828 podStartE2EDuration="1m12.377354233s" podCreationTimestamp="2025-12-04 12:36:01 +0000 UTC" 
firstStartedPulling="2025-12-04 12:36:05.596760835 +0000 UTC m=+1368.638207402" lastFinishedPulling="2025-12-04 12:37:11.33649024 +0000 UTC m=+1434.377936807" observedRunningTime="2025-12-04 12:37:13.337579521 +0000 UTC m=+1436.379026088" watchObservedRunningTime="2025-12-04 12:37:13.377354233 +0000 UTC m=+1436.418800800" Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.389080 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-46wx2" podStartSLOduration=5.094171734 podStartE2EDuration="1m12.389034574s" podCreationTimestamp="2025-12-04 12:36:01 +0000 UTC" firstStartedPulling="2025-12-04 12:36:04.065117525 +0000 UTC m=+1367.106564092" lastFinishedPulling="2025-12-04 12:37:11.359980365 +0000 UTC m=+1434.401426932" observedRunningTime="2025-12-04 12:37:13.369898446 +0000 UTC m=+1436.411345013" watchObservedRunningTime="2025-12-04 12:37:13.389034574 +0000 UTC m=+1436.430481141" Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.888715 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf8aac0-0b5c-4170-b849-374f1f4fd65c" path="/var/lib/kubelet/pods/dcf8aac0-0b5c-4170-b849-374f1f4fd65c/volumes" Dec 04 12:37:13 crc kubenswrapper[4760]: I1204 12:37:13.900734 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nmzx2" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.050151 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-config-data\") pod \"fe6ec201-ccbb-4003-9893-13b6656a1624\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.050286 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx4nv\" (UniqueName: \"kubernetes.io/projected/fe6ec201-ccbb-4003-9893-13b6656a1624-kube-api-access-bx4nv\") pod \"fe6ec201-ccbb-4003-9893-13b6656a1624\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.050348 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-combined-ca-bundle\") pod \"fe6ec201-ccbb-4003-9893-13b6656a1624\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.050400 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6ec201-ccbb-4003-9893-13b6656a1624-logs\") pod \"fe6ec201-ccbb-4003-9893-13b6656a1624\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.050455 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-scripts\") pod \"fe6ec201-ccbb-4003-9893-13b6656a1624\" (UID: \"fe6ec201-ccbb-4003-9893-13b6656a1624\") " Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.055505 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fe6ec201-ccbb-4003-9893-13b6656a1624-logs" (OuterVolumeSpecName: "logs") pod "fe6ec201-ccbb-4003-9893-13b6656a1624" (UID: "fe6ec201-ccbb-4003-9893-13b6656a1624"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.070448 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-scripts" (OuterVolumeSpecName: "scripts") pod "fe6ec201-ccbb-4003-9893-13b6656a1624" (UID: "fe6ec201-ccbb-4003-9893-13b6656a1624"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.072461 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6ec201-ccbb-4003-9893-13b6656a1624-kube-api-access-bx4nv" (OuterVolumeSpecName: "kube-api-access-bx4nv") pod "fe6ec201-ccbb-4003-9893-13b6656a1624" (UID: "fe6ec201-ccbb-4003-9893-13b6656a1624"). InnerVolumeSpecName "kube-api-access-bx4nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.110499 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-config-data" (OuterVolumeSpecName: "config-data") pod "fe6ec201-ccbb-4003-9893-13b6656a1624" (UID: "fe6ec201-ccbb-4003-9893-13b6656a1624"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.131405 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe6ec201-ccbb-4003-9893-13b6656a1624" (UID: "fe6ec201-ccbb-4003-9893-13b6656a1624"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.155575 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.155645 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx4nv\" (UniqueName: \"kubernetes.io/projected/fe6ec201-ccbb-4003-9893-13b6656a1624-kube-api-access-bx4nv\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.155665 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.155678 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe6ec201-ccbb-4003-9893-13b6656a1624-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.155692 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ec201-ccbb-4003-9893-13b6656a1624-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.289857 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6999bbbcbb-mwnkr"] Dec 04 12:37:14 crc kubenswrapper[4760]: E1204 12:37:14.290706 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6ec201-ccbb-4003-9893-13b6656a1624" containerName="placement-db-sync" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.290741 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6ec201-ccbb-4003-9893-13b6656a1624" containerName="placement-db-sync" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.291094 4760 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6ec201-ccbb-4003-9893-13b6656a1624" containerName="placement-db-sync" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.292836 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.306343 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.306659 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.362938 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-scripts\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.363028 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-config-data\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.363067 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lndbf\" (UniqueName: \"kubernetes.io/projected/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-kube-api-access-lndbf\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.363094 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-internal-tls-certs\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.363173 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-logs\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.363235 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-combined-ca-bundle\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.363266 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-public-tls-certs\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.367484 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6999bbbcbb-mwnkr"] Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.382395 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nmzx2" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.384024 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nmzx2" event={"ID":"fe6ec201-ccbb-4003-9893-13b6656a1624","Type":"ContainerDied","Data":"6879120be722a47613daf71b70225d46f2614d279e96368790215906062a7ca2"} Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.384077 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6879120be722a47613daf71b70225d46f2614d279e96368790215906062a7ca2" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.399183 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7fc6c944-sh7tv" event={"ID":"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5","Type":"ContainerStarted","Data":"7754d6e6486a7c466fd3b1bfb32b05bda479f30a77fc71e23d9cfe03f99ca18c"} Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.420514 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f8fb5648-87dff" event={"ID":"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc","Type":"ContainerStarted","Data":"850fe6326e562e31921afe14eb22f002e3b2f4fe609aaeedf11c8c3082f601e7"} Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.439090 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d74ff5d57-x29r6" event={"ID":"da835318-50f2-43af-9988-bad83a5ee42c","Type":"ContainerStarted","Data":"c0eeb83719f3bd31f7ae7d1e015c7a5d179c0cae87603d8e1762431a0a13a3f2"} Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.439504 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.458822 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5b7fc6c944-sh7tv" podStartSLOduration=10.12262265 podStartE2EDuration="1m1.458786972s" podCreationTimestamp="2025-12-04 12:36:13 +0000 UTC" 
firstStartedPulling="2025-12-04 12:36:19.920797773 +0000 UTC m=+1382.962244340" lastFinishedPulling="2025-12-04 12:37:11.256962095 +0000 UTC m=+1434.298408662" observedRunningTime="2025-12-04 12:37:14.444640893 +0000 UTC m=+1437.486087460" watchObservedRunningTime="2025-12-04 12:37:14.458786972 +0000 UTC m=+1437.500233539" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.464846 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-logs\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.464941 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-combined-ca-bundle\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.464974 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-public-tls-certs\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.465050 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-scripts\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.465106 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-config-data\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.465142 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lndbf\" (UniqueName: \"kubernetes.io/projected/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-kube-api-access-lndbf\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.465174 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-internal-tls-certs\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.466852 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-logs\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.473162 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-public-tls-certs\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.475634 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-combined-ca-bundle\") pod 
\"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.477055 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-config-data\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.477854 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-scripts\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.483144 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-internal-tls-certs\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.498365 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66f8fb5648-87dff" podStartSLOduration=10.162075351 podStartE2EDuration="1m1.498330386s" podCreationTimestamp="2025-12-04 12:36:13 +0000 UTC" firstStartedPulling="2025-12-04 12:36:19.922178338 +0000 UTC m=+1382.963624905" lastFinishedPulling="2025-12-04 12:37:11.258433373 +0000 UTC m=+1434.299879940" observedRunningTime="2025-12-04 12:37:14.483289249 +0000 UTC m=+1437.524735806" watchObservedRunningTime="2025-12-04 12:37:14.498330386 +0000 UTC m=+1437.539776963" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.514580 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lndbf\" (UniqueName: \"kubernetes.io/projected/72a9e917-ec75-4b75-a7db-ca42c3e8d1f5-kube-api-access-lndbf\") pod \"placement-6999bbbcbb-mwnkr\" (UID: \"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5\") " pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.523785 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d74ff5d57-x29r6" podStartSLOduration=3.523680081 podStartE2EDuration="3.523680081s" podCreationTimestamp="2025-12-04 12:37:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:37:14.505823135 +0000 UTC m=+1437.547269702" watchObservedRunningTime="2025-12-04 12:37:14.523680081 +0000 UTC m=+1437.565126658" Dec 04 12:37:14 crc kubenswrapper[4760]: I1204 12:37:14.621022 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:15 crc kubenswrapper[4760]: I1204 12:37:15.264057 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6999bbbcbb-mwnkr"] Dec 04 12:37:15 crc kubenswrapper[4760]: I1204 12:37:15.477142 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6999bbbcbb-mwnkr" event={"ID":"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5","Type":"ContainerStarted","Data":"f178fa8cfcf2d8a4013f9bb5f32e5423278616723f04102da4d8985a91b8740e"} Dec 04 12:37:16 crc kubenswrapper[4760]: I1204 12:37:16.527559 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6999bbbcbb-mwnkr" event={"ID":"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5","Type":"ContainerStarted","Data":"89eb3ba3539c2be4cb87618c1102ba6846839177accd7e180ec3ca5849ae0b38"} Dec 04 12:37:16 crc kubenswrapper[4760]: I1204 12:37:16.528170 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6999bbbcbb-mwnkr" 
event={"ID":"72a9e917-ec75-4b75-a7db-ca42c3e8d1f5","Type":"ContainerStarted","Data":"52f7684638c0121890fe00d2a139192b9de026e9308c8fe97efa6953af3daf7f"} Dec 04 12:37:16 crc kubenswrapper[4760]: I1204 12:37:16.528251 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:16 crc kubenswrapper[4760]: I1204 12:37:16.528295 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:16 crc kubenswrapper[4760]: I1204 12:37:16.565174 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6999bbbcbb-mwnkr" podStartSLOduration=2.565147503 podStartE2EDuration="2.565147503s" podCreationTimestamp="2025-12-04 12:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:37:16.56473958 +0000 UTC m=+1439.606186157" watchObservedRunningTime="2025-12-04 12:37:16.565147503 +0000 UTC m=+1439.606594070" Dec 04 12:37:18 crc kubenswrapper[4760]: I1204 12:37:18.153329 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-f9db4bf7b-jdfxz" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 12:37:18 crc kubenswrapper[4760]: I1204 12:37:18.157169 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-f9db4bf7b-jdfxz" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 12:37:18 crc kubenswrapper[4760]: I1204 12:37:18.159259 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-f9db4bf7b-jdfxz" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 12:37:20 
crc kubenswrapper[4760]: I1204 12:37:20.598381 4760 generic.go:334] "Generic (PLEG): container finished" podID="640263be-b424-4ed1-b0f5-d4b9907113e2" containerID="530aa0c9feea88fddde13fdf57142533cbbd84649730112263053ace7b0beff8" exitCode=0 Dec 04 12:37:20 crc kubenswrapper[4760]: I1204 12:37:20.598845 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rqwnf" event={"ID":"640263be-b424-4ed1-b0f5-d4b9907113e2","Type":"ContainerDied","Data":"530aa0c9feea88fddde13fdf57142533cbbd84649730112263053ace7b0beff8"} Dec 04 12:37:20 crc kubenswrapper[4760]: I1204 12:37:20.618112 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-869c8d7d5c-srd5v" podUID="c05ef76a-b809-4d3d-972c-2e5d2037b806" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 12:37:20 crc kubenswrapper[4760]: I1204 12:37:20.620495 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-869c8d7d5c-srd5v" podUID="c05ef76a-b809-4d3d-972c-2e5d2037b806" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 12:37:20 crc kubenswrapper[4760]: I1204 12:37:20.621801 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-869c8d7d5c-srd5v" podUID="c05ef76a-b809-4d3d-972c-2e5d2037b806" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 12:37:23 crc kubenswrapper[4760]: I1204 12:37:23.663496 4760 generic.go:334] "Generic (PLEG): container finished" podID="a8179a26-2281-4a5d-bc77-808a2f7e61bb" containerID="437b9589ac1c40aef31a422b9ee2eda92a071d0f18efbaf73fcf51148230d821" exitCode=0 Dec 04 12:37:23 crc kubenswrapper[4760]: I1204 12:37:23.663839 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-46wx2" 
event={"ID":"a8179a26-2281-4a5d-bc77-808a2f7e61bb","Type":"ContainerDied","Data":"437b9589ac1c40aef31a422b9ee2eda92a071d0f18efbaf73fcf51148230d821"} Dec 04 12:37:23 crc kubenswrapper[4760]: I1204 12:37:23.913054 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:37:23 crc kubenswrapper[4760]: I1204 12:37:23.913137 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.177610 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.177691 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.188414 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.251697 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.370828 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc8gw\" (UniqueName: \"kubernetes.io/projected/640263be-b424-4ed1-b0f5-d4b9907113e2-kube-api-access-mc8gw\") pod \"640263be-b424-4ed1-b0f5-d4b9907113e2\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.370939 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-db-sync-config-data\") pod \"640263be-b424-4ed1-b0f5-d4b9907113e2\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.371015 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-combined-ca-bundle\") pod \"640263be-b424-4ed1-b0f5-d4b9907113e2\" (UID: \"640263be-b424-4ed1-b0f5-d4b9907113e2\") " Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.381454 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "640263be-b424-4ed1-b0f5-d4b9907113e2" (UID: "640263be-b424-4ed1-b0f5-d4b9907113e2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.381645 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/640263be-b424-4ed1-b0f5-d4b9907113e2-kube-api-access-mc8gw" (OuterVolumeSpecName: "kube-api-access-mc8gw") pod "640263be-b424-4ed1-b0f5-d4b9907113e2" (UID: "640263be-b424-4ed1-b0f5-d4b9907113e2"). 
InnerVolumeSpecName "kube-api-access-mc8gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.424897 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "640263be-b424-4ed1-b0f5-d4b9907113e2" (UID: "640263be-b424-4ed1-b0f5-d4b9907113e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.472537 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc8gw\" (UniqueName: \"kubernetes.io/projected/640263be-b424-4ed1-b0f5-d4b9907113e2-kube-api-access-mc8gw\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.472583 4760 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.472595 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640263be-b424-4ed1-b0f5-d4b9907113e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.701979 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rqwnf" Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.702185 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rqwnf" event={"ID":"640263be-b424-4ed1-b0f5-d4b9907113e2","Type":"ContainerDied","Data":"de9710224142192f001fd11b973c93d999b81e9f217b5b158d1e93c4a8c90f7e"} Dec 04 12:37:24 crc kubenswrapper[4760]: I1204 12:37:24.703483 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9710224142192f001fd11b973c93d999b81e9f217b5b158d1e93c4a8c90f7e" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.056397 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-46wx2" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.193100 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8gh5\" (UniqueName: \"kubernetes.io/projected/a8179a26-2281-4a5d-bc77-808a2f7e61bb-kube-api-access-z8gh5\") pod \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.194427 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8179a26-2281-4a5d-bc77-808a2f7e61bb-etc-machine-id\") pod \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.194532 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-config-data\") pod \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.194552 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a8179a26-2281-4a5d-bc77-808a2f7e61bb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a8179a26-2281-4a5d-bc77-808a2f7e61bb" (UID: "a8179a26-2281-4a5d-bc77-808a2f7e61bb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.194660 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-combined-ca-bundle\") pod \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.194695 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-scripts\") pod \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.194798 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-db-sync-config-data\") pod \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\" (UID: \"a8179a26-2281-4a5d-bc77-808a2f7e61bb\") " Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.195525 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a8179a26-2281-4a5d-bc77-808a2f7e61bb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.199560 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8179a26-2281-4a5d-bc77-808a2f7e61bb-kube-api-access-z8gh5" (OuterVolumeSpecName: "kube-api-access-z8gh5") pod "a8179a26-2281-4a5d-bc77-808a2f7e61bb" (UID: "a8179a26-2281-4a5d-bc77-808a2f7e61bb"). 
InnerVolumeSpecName "kube-api-access-z8gh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.201793 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a8179a26-2281-4a5d-bc77-808a2f7e61bb" (UID: "a8179a26-2281-4a5d-bc77-808a2f7e61bb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.202186 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-scripts" (OuterVolumeSpecName: "scripts") pod "a8179a26-2281-4a5d-bc77-808a2f7e61bb" (UID: "a8179a26-2281-4a5d-bc77-808a2f7e61bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.275984 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8179a26-2281-4a5d-bc77-808a2f7e61bb" (UID: "a8179a26-2281-4a5d-bc77-808a2f7e61bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.297375 4760 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.297443 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8gh5\" (UniqueName: \"kubernetes.io/projected/a8179a26-2281-4a5d-bc77-808a2f7e61bb-kube-api-access-z8gh5\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.297459 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.297472 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.319898 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-config-data" (OuterVolumeSpecName: "config-data") pod "a8179a26-2281-4a5d-bc77-808a2f7e61bb" (UID: "a8179a26-2281-4a5d-bc77-808a2f7e61bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.400506 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8179a26-2281-4a5d-bc77-808a2f7e61bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:25 crc kubenswrapper[4760]: E1204 12:37:25.596001 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="23335f60-d3db-4308-b1fe-a4603a8d65e7" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.618588 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-795464b486-tr8rf"] Dec 04 12:37:25 crc kubenswrapper[4760]: E1204 12:37:25.619226 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8179a26-2281-4a5d-bc77-808a2f7e61bb" containerName="cinder-db-sync" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.619244 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8179a26-2281-4a5d-bc77-808a2f7e61bb" containerName="cinder-db-sync" Dec 04 12:37:25 crc kubenswrapper[4760]: E1204 12:37:25.619273 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="640263be-b424-4ed1-b0f5-d4b9907113e2" containerName="barbican-db-sync" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.619280 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="640263be-b424-4ed1-b0f5-d4b9907113e2" containerName="barbican-db-sync" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.619464 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8179a26-2281-4a5d-bc77-808a2f7e61bb" containerName="cinder-db-sync" Dec 04 12:37:25 crc 
kubenswrapper[4760]: I1204 12:37:25.619488 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="640263be-b424-4ed1-b0f5-d4b9907113e2" containerName="barbican-db-sync" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.620877 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.644396 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.644583 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.672522 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-wjwwq" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.679308 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-577684486f-vqq72"] Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.681665 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.695037 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-795464b486-tr8rf"] Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.708982 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.742717 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-46wx2" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.743597 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-46wx2" event={"ID":"a8179a26-2281-4a5d-bc77-808a2f7e61bb","Type":"ContainerDied","Data":"039259f79253606608032edf8d6888a8f357bb8e2e6a9242ad9f050e2792991c"} Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.743662 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="039259f79253606608032edf8d6888a8f357bb8e2e6a9242ad9f050e2792991c" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.771720 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23335f60-d3db-4308-b1fe-a4603a8d65e7","Type":"ContainerStarted","Data":"6cd4b7442c5d8c26b0f74bce2e01f566fd68b0161b693e247c6a5f2f6624bc3d"} Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.772101 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23335f60-d3db-4308-b1fe-a4603a8d65e7" containerName="ceilometer-notification-agent" containerID="cri-o://5e2f33fe8fe3223fef6bc089e77f6416ffe37827017225a0e0eb344ea979a811" gracePeriod=30 Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.775220 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23335f60-d3db-4308-b1fe-a4603a8d65e7" containerName="proxy-httpd" containerID="cri-o://6cd4b7442c5d8c26b0f74bce2e01f566fd68b0161b693e247c6a5f2f6624bc3d" gracePeriod=30 Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.776149 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.813341 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-577684486f-vqq72"] Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.814968 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-config-data\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.815053 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6ee4654-5dd5-4c14-9985-1037a884e4b7-logs\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.815199 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-logs\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.815428 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ee4654-5dd5-4c14-9985-1037a884e4b7-config-data\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.815469 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-config-data-custom\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " 
pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.815508 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brrfv\" (UniqueName: \"kubernetes.io/projected/e6ee4654-5dd5-4c14-9985-1037a884e4b7-kube-api-access-brrfv\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.815564 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rlrv\" (UniqueName: \"kubernetes.io/projected/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-kube-api-access-5rlrv\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.815618 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6ee4654-5dd5-4c14-9985-1037a884e4b7-config-data-custom\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.815644 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-combined-ca-bundle\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:25 crc kubenswrapper[4760]: I1204 12:37:25.815667 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e6ee4654-5dd5-4c14-9985-1037a884e4b7-combined-ca-bundle\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.087528 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-config-data\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.088271 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6ee4654-5dd5-4c14-9985-1037a884e4b7-logs\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.088640 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-logs\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.088683 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ee4654-5dd5-4c14-9985-1037a884e4b7-config-data\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.088711 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-config-data-custom\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.088735 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brrfv\" (UniqueName: \"kubernetes.io/projected/e6ee4654-5dd5-4c14-9985-1037a884e4b7-kube-api-access-brrfv\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.088771 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rlrv\" (UniqueName: \"kubernetes.io/projected/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-kube-api-access-5rlrv\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.088841 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6ee4654-5dd5-4c14-9985-1037a884e4b7-config-data-custom\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.088868 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-combined-ca-bundle\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.088897 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ee4654-5dd5-4c14-9985-1037a884e4b7-combined-ca-bundle\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.110351 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6ee4654-5dd5-4c14-9985-1037a884e4b7-logs\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.114182 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-logs\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.120509 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-combined-ca-bundle\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.132593 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brrfv\" (UniqueName: \"kubernetes.io/projected/e6ee4654-5dd5-4c14-9985-1037a884e4b7-kube-api-access-brrfv\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.142369 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rlrv\" (UniqueName: \"kubernetes.io/projected/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-kube-api-access-5rlrv\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.143742 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ee4654-5dd5-4c14-9985-1037a884e4b7-config-data\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.168870 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-config-data\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.288885 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ee4654-5dd5-4c14-9985-1037a884e4b7-combined-ca-bundle\") pod \"barbican-worker-577684486f-vqq72\" (UID: \"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.288993 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-q5847"] Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.295546 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6ee4654-5dd5-4c14-9985-1037a884e4b7-config-data-custom\") pod \"barbican-worker-577684486f-vqq72\" (UID: 
\"e6ee4654-5dd5-4c14-9985-1037a884e4b7\") " pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.318765 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-577684486f-vqq72" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.335119 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.342636 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17dd60e5-2f5d-4f7d-b694-9fa3245dc207-config-data-custom\") pod \"barbican-keystone-listener-795464b486-tr8rf\" (UID: \"17dd60e5-2f5d-4f7d-b694-9fa3245dc207\") " pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.355863 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-q5847"] Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.572572 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-795464b486-tr8rf" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.592438 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.592662 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-svc\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.592917 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-config\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.593136 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnzz9\" (UniqueName: \"kubernetes.io/projected/2bfe122b-75c9-4a7e-be77-5a8265ec4859-kube-api-access-jnzz9\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.593392 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-sb\") pod 
\"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.593477 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.695273 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7fcdc7577b-rkxk5"] Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.695803 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.695885 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-svc\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.695959 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-config\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.696030 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jnzz9\" (UniqueName: \"kubernetes.io/projected/2bfe122b-75c9-4a7e-be77-5a8265ec4859-kube-api-access-jnzz9\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.696097 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.696133 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.697290 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.697515 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.701368 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.702620 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.708467 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-svc\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.709023 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-config\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.726591 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.772758 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.789150 4760 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-api-7fcdc7577b-rkxk5"] Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.789367 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.801183 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnzz9\" (UniqueName: \"kubernetes.io/projected/2bfe122b-75c9-4a7e-be77-5a8265ec4859-kube-api-access-jnzz9\") pod \"dnsmasq-dns-85ff748b95-q5847\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.815301 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-combined-ca-bundle\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.815448 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.815506 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-logs\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.815641 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hshd2\" (UniqueName: \"kubernetes.io/projected/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-kube-api-access-hshd2\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.815987 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data-custom\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.818285 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-g6rbc" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.818599 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.818775 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.818986 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 12:37:26 crc kubenswrapper[4760]: I1204 12:37:26.987905 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hshd2\" (UniqueName: \"kubernetes.io/projected/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-kube-api-access-hshd2\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.008794 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.039690 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data-custom\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.040010 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-combined-ca-bundle\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.040119 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.040243 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-logs\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.055970 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.057584 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-logs\") 
pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.092201 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data-custom\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.114534 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-combined-ca-bundle\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.121469 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hshd2\" (UniqueName: \"kubernetes.io/projected/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-kube-api-access-hshd2\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.132499 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data\") pod \"barbican-api-7fcdc7577b-rkxk5\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.162056 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.162198 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpg7b\" (UniqueName: \"kubernetes.io/projected/098dadae-38a9-440e-a726-9cd9d742c6bc-kube-api-access-cpg7b\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.162297 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/098dadae-38a9-440e-a726-9cd9d742c6bc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.162364 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.162481 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.162532 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-scripts\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " 
pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.265082 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.265249 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-scripts\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.265400 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.265501 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpg7b\" (UniqueName: \"kubernetes.io/projected/098dadae-38a9-440e-a726-9cd9d742c6bc-kube-api-access-cpg7b\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.265547 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/098dadae-38a9-440e-a726-9cd9d742c6bc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.265598 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.268174 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/098dadae-38a9-440e-a726-9cd9d742c6bc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.269165 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.277131 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.286346 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.288615 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.290270 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.326104 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-scripts\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.327823 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.347452 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpg7b\" (UniqueName: \"kubernetes.io/projected/098dadae-38a9-440e-a726-9cd9d742c6bc-kube-api-access-cpg7b\") pod \"cinder-scheduler-0\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.373623 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.373689 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.373724 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-dev\") pod 
\"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.373765 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-run\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.373793 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.373840 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.373878 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.373932 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-scripts\") pod \"cinder-volume-volume1-0\" (UID: 
\"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.373990 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.374018 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.374056 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.374095 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.374136 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-ceph\") pod \"cinder-volume-volume1-0\" (UID: 
\"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.374153 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9csgk\" (UniqueName: \"kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-kube-api-access-9csgk\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.374283 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.374321 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.420692 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476375 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476457 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476506 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476552 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-run\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476585 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476639 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476676 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476727 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476790 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476820 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476871 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476924 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.476978 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.477012 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9csgk\" (UniqueName: \"kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-kube-api-access-9csgk\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.477093 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.477132 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-sys\") pod \"cinder-volume-volume1-0\" (UID: 
\"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.479235 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.479743 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.480094 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.480128 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.480148 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 
12:37:27.480558 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-sys\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.481473 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-run\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.481516 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.481552 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-dev\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.481557 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.481667 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-q5847"] Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.491143 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.497180 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.497805 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.498527 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.508113 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.519456 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.522423 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.526946 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.537994 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.539315 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.549086 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9csgk\" (UniqueName: \"kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-kube-api-access-9csgk\") pod \"cinder-volume-volume1-0\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.585409 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735236 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-lib-modules\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735326 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735370 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-scripts\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735406 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-ceph\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735571 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735605 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-sys\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735634 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735686 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-iscsi\") pod 
\"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735766 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735837 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.735961 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-dev\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.736455 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwk8\" (UniqueName: \"kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-kube-api-access-wpwk8\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.737012 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" 
Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.737049 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-run\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.737090 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-nvme\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.737108 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data-custom\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.789704 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cc4k9"] Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.796566 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845013 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-dev\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845103 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845147 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845185 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845266 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwk8\" (UniqueName: \"kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-kube-api-access-wpwk8\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " 
pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845328 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845351 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-run\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845374 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-nvme\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845402 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data-custom\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845460 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-lib-modules\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845499 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845523 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-scripts\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845554 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-ceph\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845589 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845616 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-sys\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845640 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 
12:37:27.845669 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845722 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845775 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845835 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-config\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845871 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5btfv\" (UniqueName: \"kubernetes.io/projected/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-kube-api-access-5btfv\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.845898 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.846991 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-lib-modules\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.849022 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-dev\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.849264 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.849285 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.849322 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 
04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.849359 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-run\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.849376 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-nvme\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.853238 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.853339 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-sys\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.853384 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.858357 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: 
\"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.859858 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cc4k9"] Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.875582 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.875702 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-scripts\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.876140 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-ceph\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.914888 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpwk8\" (UniqueName: \"kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-kube-api-access-wpwk8\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.915766 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data-custom\") pod \"cinder-backup-0\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " pod="openstack/cinder-backup-0" Dec 04 
12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.949333 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.949752 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.949777 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.949965 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.950125 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-config\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.950154 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5btfv\" (UniqueName: \"kubernetes.io/projected/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-kube-api-access-5btfv\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.952480 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.954120 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.955275 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-config\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.955906 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.956155 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.957361 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.973600 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.973779 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 12:37:27 crc kubenswrapper[4760]: I1204 12:37:27.982876 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.021855 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5btfv\" (UniqueName: \"kubernetes.io/projected/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-kube-api-access-5btfv\") pod \"dnsmasq-dns-5c9776ccc5-cc4k9\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.056606 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-scripts\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.057124 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrhcp\" (UniqueName: \"kubernetes.io/projected/407783b0-acfe-48e9-87a8-83ba4c28ab08-kube-api-access-hrhcp\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " 
pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.063785 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data-custom\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.064398 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/407783b0-acfe-48e9-87a8-83ba4c28ab08-logs\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.064482 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.064759 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.065138 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/407783b0-acfe-48e9-87a8-83ba4c28ab08-etc-machine-id\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.357124 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/407783b0-acfe-48e9-87a8-83ba4c28ab08-logs\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.357186 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.367407 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.367675 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/407783b0-acfe-48e9-87a8-83ba4c28ab08-etc-machine-id\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.368034 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-scripts\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.368241 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrhcp\" (UniqueName: \"kubernetes.io/projected/407783b0-acfe-48e9-87a8-83ba4c28ab08-kube-api-access-hrhcp\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " 
pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.368380 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data-custom\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.374906 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/407783b0-acfe-48e9-87a8-83ba4c28ab08-logs\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.376440 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/407783b0-acfe-48e9-87a8-83ba4c28ab08-etc-machine-id\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.380044 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.382736 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.415059 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrhcp\" (UniqueName: \"kubernetes.io/projected/407783b0-acfe-48e9-87a8-83ba4c28ab08-kube-api-access-hrhcp\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.418631 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-795464b486-tr8rf"] Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.419601 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.421321 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data-custom\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.471045 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-scripts\") pod \"cinder-api-0\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " pod="openstack/cinder-api-0" Dec 04 
12:37:28 crc kubenswrapper[4760]: E1204 12:37:28.487781 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23335f60_d3db_4308_b1fe_a4603a8d65e7.slice/crio-5e2f33fe8fe3223fef6bc089e77f6416ffe37827017225a0e0eb344ea979a811.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23335f60_d3db_4308_b1fe_a4603a8d65e7.slice/crio-conmon-5e2f33fe8fe3223fef6bc089e77f6416ffe37827017225a0e0eb344ea979a811.scope\": RecentStats: unable to find data in memory cache]" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.576858 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-577684486f-vqq72"] Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.621419 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.720755 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.740602 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-q5847"] Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.802981 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.862128 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 12:37:28 crc kubenswrapper[4760]: I1204 12:37:28.926964 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7fcdc7577b-rkxk5"] Dec 04 12:37:29 crc kubenswrapper[4760]: I1204 12:37:29.080004 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"098dadae-38a9-440e-a726-9cd9d742c6bc","Type":"ContainerStarted","Data":"0b361e9ecf4bbab8736a42319953d08ec796951c6ea844eb51df929c3682f97a"} Dec 04 12:37:29 crc kubenswrapper[4760]: I1204 12:37:29.089706 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-577684486f-vqq72" event={"ID":"e6ee4654-5dd5-4c14-9985-1037a884e4b7","Type":"ContainerStarted","Data":"ae0722885df52ec219663826464b0ad0ffe6dc2973075e9101e74c5fcf6bc788"} Dec 04 12:37:29 crc kubenswrapper[4760]: I1204 12:37:29.128906 4760 generic.go:334] "Generic (PLEG): container finished" podID="23335f60-d3db-4308-b1fe-a4603a8d65e7" containerID="5e2f33fe8fe3223fef6bc089e77f6416ffe37827017225a0e0eb344ea979a811" exitCode=0 Dec 04 12:37:29 crc kubenswrapper[4760]: I1204 12:37:29.129122 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23335f60-d3db-4308-b1fe-a4603a8d65e7","Type":"ContainerDied","Data":"5e2f33fe8fe3223fef6bc089e77f6416ffe37827017225a0e0eb344ea979a811"} Dec 04 12:37:29 crc kubenswrapper[4760]: I1204 12:37:29.148620 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-q5847" event={"ID":"2bfe122b-75c9-4a7e-be77-5a8265ec4859","Type":"ContainerStarted","Data":"8825a86cf4a54ed325282a83c5d64f9fd2da02949d26940ba56bd4bbb0135821"} Dec 04 12:37:29 crc kubenswrapper[4760]: I1204 12:37:29.165781 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-795464b486-tr8rf" 
event={"ID":"17dd60e5-2f5d-4f7d-b694-9fa3245dc207","Type":"ContainerStarted","Data":"cccdcfd3279b91270ca354ad425f9139f645ef42b0e49a3b629d3a4f3aca58af"} Dec 04 12:37:29 crc kubenswrapper[4760]: I1204 12:37:29.804590 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 12:37:29 crc kubenswrapper[4760]: W1204 12:37:29.903589 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7b719f3_24b5_44b8_9726_31c19ecc45b2.slice/crio-602ca75d518ab2b5c30e9a7379238160d6adc2113f4ba2b335b0d4f14185facb WatchSource:0}: Error finding container 602ca75d518ab2b5c30e9a7379238160d6adc2113f4ba2b335b0d4f14185facb: Status 404 returned error can't find the container with id 602ca75d518ab2b5c30e9a7379238160d6adc2113f4ba2b335b0d4f14185facb Dec 04 12:37:30 crc kubenswrapper[4760]: I1204 12:37:30.110250 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 12:37:30 crc kubenswrapper[4760]: I1204 12:37:30.130123 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cc4k9"] Dec 04 12:37:30 crc kubenswrapper[4760]: I1204 12:37:30.265570 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" event={"ID":"fc1d29a4-5ae9-4726-b17c-5a1494bb9240","Type":"ContainerStarted","Data":"73f028c1745ef8f9a481bbaff344a7eefd4b5e63fc00c588014c50e3f37aaccf"} Dec 04 12:37:30 crc kubenswrapper[4760]: I1204 12:37:30.297083 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"43ec58b9-74ae-4457-928c-e20bb1c496ec","Type":"ContainerStarted","Data":"a4535353afb3bf0c9620eef21b7ac71234e6d0d9d69a0cb10f743093b61cb1a2"} Dec 04 12:37:30 crc kubenswrapper[4760]: I1204 12:37:30.302963 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"c7b719f3-24b5-44b8-9726-31c19ecc45b2","Type":"ContainerStarted","Data":"602ca75d518ab2b5c30e9a7379238160d6adc2113f4ba2b335b0d4f14185facb"} Dec 04 12:37:30 crc kubenswrapper[4760]: I1204 12:37:30.317612 4760 generic.go:334] "Generic (PLEG): container finished" podID="2bfe122b-75c9-4a7e-be77-5a8265ec4859" containerID="1d9ee2ee2dfb918ad0e8fcec5d7144bf1a8f96ec975a9b4e7f68792ac4e7cd0f" exitCode=0 Dec 04 12:37:30 crc kubenswrapper[4760]: I1204 12:37:30.317773 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-q5847" event={"ID":"2bfe122b-75c9-4a7e-be77-5a8265ec4859","Type":"ContainerDied","Data":"1d9ee2ee2dfb918ad0e8fcec5d7144bf1a8f96ec975a9b4e7f68792ac4e7cd0f"} Dec 04 12:37:30 crc kubenswrapper[4760]: I1204 12:37:30.352756 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcdc7577b-rkxk5" event={"ID":"5fef0dd3-e954-4c82-8d4e-82ae547a4b03","Type":"ContainerStarted","Data":"04efaff8b456f78b9ea2a869f0869ba1537e11ced6a1f5d00ed73338178777e1"} Dec 04 12:37:30 crc kubenswrapper[4760]: I1204 12:37:30.352843 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcdc7577b-rkxk5" event={"ID":"5fef0dd3-e954-4c82-8d4e-82ae547a4b03","Type":"ContainerStarted","Data":"7fe1ef09ee1da765e3a604e9ec0edd8d4ff53f1d7cf3b3e4fb79c2814c451c56"} Dec 04 12:37:30 crc kubenswrapper[4760]: I1204 12:37:30.641132 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 12:37:30 crc kubenswrapper[4760]: W1204 12:37:30.683722 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod407783b0_acfe_48e9_87a8_83ba4c28ab08.slice/crio-57521c421cab46d902113e4db9b9d04db07bf9842f026af9a0ff6e300e75500c WatchSource:0}: Error finding container 57521c421cab46d902113e4db9b9d04db07bf9842f026af9a0ff6e300e75500c: Status 404 returned error can't find the container with id 
57521c421cab46d902113e4db9b9d04db07bf9842f026af9a0ff6e300e75500c Dec 04 12:37:31 crc kubenswrapper[4760]: I1204 12:37:31.570673 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"407783b0-acfe-48e9-87a8-83ba4c28ab08","Type":"ContainerStarted","Data":"57521c421cab46d902113e4db9b9d04db07bf9842f026af9a0ff6e300e75500c"} Dec 04 12:37:31 crc kubenswrapper[4760]: I1204 12:37:31.600720 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.043895 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.170818 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-sb\") pod \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.171115 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-nb\") pod \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.253804 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-config\") pod \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.253907 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnzz9\" (UniqueName: 
\"kubernetes.io/projected/2bfe122b-75c9-4a7e-be77-5a8265ec4859-kube-api-access-jnzz9\") pod \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.253986 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-swift-storage-0\") pod \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.254069 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-svc\") pod \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\" (UID: \"2bfe122b-75c9-4a7e-be77-5a8265ec4859\") " Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.284155 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2bfe122b-75c9-4a7e-be77-5a8265ec4859" (UID: "2bfe122b-75c9-4a7e-be77-5a8265ec4859"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.304534 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bfe122b-75c9-4a7e-be77-5a8265ec4859-kube-api-access-jnzz9" (OuterVolumeSpecName: "kube-api-access-jnzz9") pod "2bfe122b-75c9-4a7e-be77-5a8265ec4859" (UID: "2bfe122b-75c9-4a7e-be77-5a8265ec4859"). InnerVolumeSpecName "kube-api-access-jnzz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.333084 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-config" (OuterVolumeSpecName: "config") pod "2bfe122b-75c9-4a7e-be77-5a8265ec4859" (UID: "2bfe122b-75c9-4a7e-be77-5a8265ec4859"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.353172 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2bfe122b-75c9-4a7e-be77-5a8265ec4859" (UID: "2bfe122b-75c9-4a7e-be77-5a8265ec4859"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.359182 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.359242 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.359256 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnzz9\" (UniqueName: \"kubernetes.io/projected/2bfe122b-75c9-4a7e-be77-5a8265ec4859-kube-api-access-jnzz9\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.359268 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 
12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.382484 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2bfe122b-75c9-4a7e-be77-5a8265ec4859" (UID: "2bfe122b-75c9-4a7e-be77-5a8265ec4859"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.419014 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2bfe122b-75c9-4a7e-be77-5a8265ec4859" (UID: "2bfe122b-75c9-4a7e-be77-5a8265ec4859"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.463082 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.463440 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bfe122b-75c9-4a7e-be77-5a8265ec4859-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.637024 4760 generic.go:334] "Generic (PLEG): container finished" podID="fc1d29a4-5ae9-4726-b17c-5a1494bb9240" containerID="c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2" exitCode=0 Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.637261 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" event={"ID":"fc1d29a4-5ae9-4726-b17c-5a1494bb9240","Type":"ContainerDied","Data":"c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2"} Dec 04 12:37:32 crc 
kubenswrapper[4760]: I1204 12:37:32.641794 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"098dadae-38a9-440e-a726-9cd9d742c6bc","Type":"ContainerStarted","Data":"56e236b5971fa53c2c8d8e9a58369e07013b3b63ac37b9efb7a3bf79ebe6d9ef"} Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.658989 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-q5847" event={"ID":"2bfe122b-75c9-4a7e-be77-5a8265ec4859","Type":"ContainerDied","Data":"8825a86cf4a54ed325282a83c5d64f9fd2da02949d26940ba56bd4bbb0135821"} Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.659059 4760 scope.go:117] "RemoveContainer" containerID="1d9ee2ee2dfb918ad0e8fcec5d7144bf1a8f96ec975a9b4e7f68792ac4e7cd0f" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.659279 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-q5847" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.698550 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcdc7577b-rkxk5" event={"ID":"5fef0dd3-e954-4c82-8d4e-82ae547a4b03","Type":"ContainerStarted","Data":"5a21f8fa80df7f7cfb02463472f713e42b712688c55aa4f218f71f1491bdcd83"} Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.699797 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.699825 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.829724 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7fcdc7577b-rkxk5" podStartSLOduration=6.82968738 podStartE2EDuration="6.82968738s" podCreationTimestamp="2025-12-04 12:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:37:32.786938214 +0000 UTC m=+1455.828384791" watchObservedRunningTime="2025-12-04 12:37:32.82968738 +0000 UTC m=+1455.871133957" Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.928654 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-q5847"] Dec 04 12:37:32 crc kubenswrapper[4760]: I1204 12:37:32.936747 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-q5847"] Dec 04 12:37:33 crc kubenswrapper[4760]: I1204 12:37:33.712429 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"407783b0-acfe-48e9-87a8-83ba4c28ab08","Type":"ContainerStarted","Data":"886f21ac4a1cc4d6ecfb9de0e981d6b14652a442d16cd965945fd4e25e0bc18a"} Dec 04 12:37:33 crc kubenswrapper[4760]: I1204 12:37:33.716479 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"43ec58b9-74ae-4457-928c-e20bb1c496ec","Type":"ContainerStarted","Data":"f366d4b6db5db971446a15c91ce9b932b1a50251bb74ea4ce78e639a44d3523d"} Dec 04 12:37:33 crc kubenswrapper[4760]: I1204 12:37:33.877951 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bfe122b-75c9-4a7e-be77-5a8265ec4859" path="/var/lib/kubelet/pods/2bfe122b-75c9-4a7e-be77-5a8265ec4859/volumes" Dec 04 12:37:33 crc kubenswrapper[4760]: I1204 12:37:33.916113 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 04 12:37:34 crc kubenswrapper[4760]: I1204 12:37:34.179259 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.605925 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-658b6c7fb4-vh8cp"] Dec 04 12:37:35 crc kubenswrapper[4760]: E1204 12:37:35.608386 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bfe122b-75c9-4a7e-be77-5a8265ec4859" containerName="init" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.608410 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bfe122b-75c9-4a7e-be77-5a8265ec4859" containerName="init" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.608680 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bfe122b-75c9-4a7e-be77-5a8265ec4859" containerName="init" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.614177 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.690811 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-combined-ca-bundle\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.690901 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-public-tls-certs\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.690992 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-config-data\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.691069 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc2nh\" (UniqueName: \"kubernetes.io/projected/ac9e67e5-eea3-4608-bd30-8483225d28d2-kube-api-access-mc2nh\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.691124 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-config-data-custom\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.720116 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac9e67e5-eea3-4608-bd30-8483225d28d2-logs\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.720482 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-internal-tls-certs\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.737659 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.738133 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.798895 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-658b6c7fb4-vh8cp"] Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.824685 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-config-data\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.826084 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc2nh\" (UniqueName: \"kubernetes.io/projected/ac9e67e5-eea3-4608-bd30-8483225d28d2-kube-api-access-mc2nh\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.826160 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-config-data-custom\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.826341 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac9e67e5-eea3-4608-bd30-8483225d28d2-logs\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.826462 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-internal-tls-certs\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.826542 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-combined-ca-bundle\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.826572 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-public-tls-certs\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.829052 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac9e67e5-eea3-4608-bd30-8483225d28d2-logs\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.900463 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc2nh\" (UniqueName: \"kubernetes.io/projected/ac9e67e5-eea3-4608-bd30-8483225d28d2-kube-api-access-mc2nh\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.901086 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-internal-tls-certs\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.901222 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-combined-ca-bundle\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.906738 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-config-data\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.917555 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-config-data-custom\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:35 crc kubenswrapper[4760]: I1204 12:37:35.921858 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac9e67e5-eea3-4608-bd30-8483225d28d2-public-tls-certs\") pod \"barbican-api-658b6c7fb4-vh8cp\" (UID: \"ac9e67e5-eea3-4608-bd30-8483225d28d2\") " pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:36 crc kubenswrapper[4760]: I1204 12:37:36.077656 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:36 crc kubenswrapper[4760]: I1204 12:37:36.918585 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-577684486f-vqq72" event={"ID":"e6ee4654-5dd5-4c14-9985-1037a884e4b7","Type":"ContainerStarted","Data":"f4596bbf5ee4c22f9b792564526daff8da3360944aa3cb4b059e249a86103ae0"} Dec 04 12:37:36 crc kubenswrapper[4760]: I1204 12:37:36.929104 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"43ec58b9-74ae-4457-928c-e20bb1c496ec","Type":"ContainerStarted","Data":"157e958df3c3bf7b5f810d87d1acac2edee3d24cc48f0c80a6d939dad0c4c425"} Dec 04 12:37:36 crc kubenswrapper[4760]: I1204 12:37:36.934151 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c7b719f3-24b5-44b8-9726-31c19ecc45b2","Type":"ContainerStarted","Data":"3fe0632f7c778fa9eb7f4dd79d236e1034e4c66caedb12739977e04f4cbb0f55"} Dec 04 12:37:36 crc kubenswrapper[4760]: I1204 12:37:36.964332 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-795464b486-tr8rf" event={"ID":"17dd60e5-2f5d-4f7d-b694-9fa3245dc207","Type":"ContainerStarted","Data":"eb8bc386cb6636896c0706f8574e4cc1810fcdda4b3bd72d5f4d1e16ce4372a1"} Dec 04 12:37:36 crc kubenswrapper[4760]: I1204 12:37:36.980293 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" event={"ID":"fc1d29a4-5ae9-4726-b17c-5a1494bb9240","Type":"ContainerStarted","Data":"a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521"} Dec 04 12:37:36 crc kubenswrapper[4760]: I1204 12:37:36.980990 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:36 crc kubenswrapper[4760]: I1204 12:37:36.991393 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" 
podStartSLOduration=9.343964871 podStartE2EDuration="10.991351174s" podCreationTimestamp="2025-12-04 12:37:26 +0000 UTC" firstStartedPulling="2025-12-04 12:37:30.16986832 +0000 UTC m=+1453.211314887" lastFinishedPulling="2025-12-04 12:37:31.817254623 +0000 UTC m=+1454.858701190" observedRunningTime="2025-12-04 12:37:36.972682122 +0000 UTC m=+1460.014128689" watchObservedRunningTime="2025-12-04 12:37:36.991351174 +0000 UTC m=+1460.032797751" Dec 04 12:37:37 crc kubenswrapper[4760]: I1204 12:37:37.023158 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-795464b486-tr8rf" podStartSLOduration=5.240872714 podStartE2EDuration="12.023123973s" podCreationTimestamp="2025-12-04 12:37:25 +0000 UTC" firstStartedPulling="2025-12-04 12:37:28.808683702 +0000 UTC m=+1451.850130269" lastFinishedPulling="2025-12-04 12:37:35.590934961 +0000 UTC m=+1458.632381528" observedRunningTime="2025-12-04 12:37:37.012633809 +0000 UTC m=+1460.054080376" watchObservedRunningTime="2025-12-04 12:37:37.023123973 +0000 UTC m=+1460.064570540" Dec 04 12:37:37 crc kubenswrapper[4760]: I1204 12:37:37.079641 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" podStartSLOduration=11.079577674 podStartE2EDuration="11.079577674s" podCreationTimestamp="2025-12-04 12:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:37:37.05580698 +0000 UTC m=+1460.097253547" watchObservedRunningTime="2025-12-04 12:37:37.079577674 +0000 UTC m=+1460.121024251" Dec 04 12:37:37 crc kubenswrapper[4760]: I1204 12:37:37.216342 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-658b6c7fb4-vh8cp"] Dec 04 12:37:37 crc kubenswrapper[4760]: W1204 12:37:37.274453 4760 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac9e67e5_eea3_4608_bd30_8483225d28d2.slice/crio-b7e0762804d2949d604bebb678db0aad0a1d413b2829e7d4a3ddcb3c3a106dd6 WatchSource:0}: Error finding container b7e0762804d2949d604bebb678db0aad0a1d413b2829e7d4a3ddcb3c3a106dd6: Status 404 returned error can't find the container with id b7e0762804d2949d604bebb678db0aad0a1d413b2829e7d4a3ddcb3c3a106dd6 Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.073894 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-658b6c7fb4-vh8cp" event={"ID":"ac9e67e5-eea3-4608-bd30-8483225d28d2","Type":"ContainerStarted","Data":"e6831d8b238b53c506406335d14371aad48e7ce8672653cdd1d094ee1c599732"} Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.074442 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-658b6c7fb4-vh8cp" event={"ID":"ac9e67e5-eea3-4608-bd30-8483225d28d2","Type":"ContainerStarted","Data":"b7e0762804d2949d604bebb678db0aad0a1d413b2829e7d4a3ddcb3c3a106dd6"} Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.161097 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-795464b486-tr8rf" event={"ID":"17dd60e5-2f5d-4f7d-b694-9fa3245dc207","Type":"ContainerStarted","Data":"adf1060ffd0b4907769f3e6612840378347242241ca331fb8a18c3dfe616f083"} Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.202160 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"098dadae-38a9-440e-a726-9cd9d742c6bc","Type":"ContainerStarted","Data":"80d98ed92e4cf693130940a12fe5fba680c5bb85116080dd49cfda916ea9541b"} Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.218603 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"407783b0-acfe-48e9-87a8-83ba4c28ab08","Type":"ContainerStarted","Data":"6caefa18364108968ba835bd145212d045b8b3af109c2b69532fb85ba3e91082"} Dec 04 12:37:38 crc 
kubenswrapper[4760]: I1204 12:37:38.218870 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api-log" containerID="cri-o://886f21ac4a1cc4d6ecfb9de0e981d6b14652a442d16cd965945fd4e25e0bc18a" gracePeriod=30 Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.219320 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.219370 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api" containerID="cri-o://6caefa18364108968ba835bd145212d045b8b3af109c2b69532fb85ba3e91082" gracePeriod=30 Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.279127 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=10.983209667 podStartE2EDuration="12.279084881s" podCreationTimestamp="2025-12-04 12:37:26 +0000 UTC" firstStartedPulling="2025-12-04 12:37:28.705744615 +0000 UTC m=+1451.747191182" lastFinishedPulling="2025-12-04 12:37:30.001619829 +0000 UTC m=+1453.043066396" observedRunningTime="2025-12-04 12:37:38.240901909 +0000 UTC m=+1461.282348476" watchObservedRunningTime="2025-12-04 12:37:38.279084881 +0000 UTC m=+1461.320531448" Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.291582 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-577684486f-vqq72" event={"ID":"e6ee4654-5dd5-4c14-9985-1037a884e4b7","Type":"ContainerStarted","Data":"2725fbe0b595a070e28222066aff8948dd59624dd6b2b173ae58cb81a9b7ba2b"} Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.319784 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"c7b719f3-24b5-44b8-9726-31c19ecc45b2","Type":"ContainerStarted","Data":"dfe0b7d287d85a7f8a4a34dfcfead054005af809d055a631c07c06ddceff017e"} Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.347710 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=11.347665467 podStartE2EDuration="11.347665467s" podCreationTimestamp="2025-12-04 12:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:37:38.31086747 +0000 UTC m=+1461.352314037" watchObservedRunningTime="2025-12-04 12:37:38.347665467 +0000 UTC m=+1461.389112034" Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.405981 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-577684486f-vqq72" podStartSLOduration=6.566774613 podStartE2EDuration="13.405933938s" podCreationTimestamp="2025-12-04 12:37:25 +0000 UTC" firstStartedPulling="2025-12-04 12:37:28.706313152 +0000 UTC m=+1451.747759719" lastFinishedPulling="2025-12-04 12:37:35.545472477 +0000 UTC m=+1458.586919044" observedRunningTime="2025-12-04 12:37:38.356804338 +0000 UTC m=+1461.398250905" watchObservedRunningTime="2025-12-04 12:37:38.405933938 +0000 UTC m=+1461.447380505" Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.451788 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.625609 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 04 12:37:38 crc kubenswrapper[4760]: I1204 12:37:38.630137 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-backup-0" podUID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.217.0.164:8080/\": dial tcp 
10.217.0.164:8080: connect: connection refused" Dec 04 12:37:39 crc kubenswrapper[4760]: I1204 12:37:39.372695 4760 generic.go:334] "Generic (PLEG): container finished" podID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerID="886f21ac4a1cc4d6ecfb9de0e981d6b14652a442d16cd965945fd4e25e0bc18a" exitCode=143 Dec 04 12:37:39 crc kubenswrapper[4760]: I1204 12:37:39.372791 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"407783b0-acfe-48e9-87a8-83ba4c28ab08","Type":"ContainerDied","Data":"886f21ac4a1cc4d6ecfb9de0e981d6b14652a442d16cd965945fd4e25e0bc18a"} Dec 04 12:37:39 crc kubenswrapper[4760]: I1204 12:37:39.377355 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-658b6c7fb4-vh8cp" event={"ID":"ac9e67e5-eea3-4608-bd30-8483225d28d2","Type":"ContainerStarted","Data":"b6e403de95807132b815b4781118e1fb539560741bcb089befce6cdca56fd95a"} Dec 04 12:37:39 crc kubenswrapper[4760]: I1204 12:37:39.378536 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:39 crc kubenswrapper[4760]: I1204 12:37:39.378602 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:39 crc kubenswrapper[4760]: I1204 12:37:39.433972 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-658b6c7fb4-vh8cp" podStartSLOduration=4.433934069 podStartE2EDuration="4.433934069s" podCreationTimestamp="2025-12-04 12:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:37:39.42734864 +0000 UTC m=+1462.468795207" watchObservedRunningTime="2025-12-04 12:37:39.433934069 +0000 UTC m=+1462.475380646" Dec 04 12:37:39 crc kubenswrapper[4760]: I1204 12:37:39.456156 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-volume-volume1-0" podStartSLOduration=7.933134558 podStartE2EDuration="13.456121724s" podCreationTimestamp="2025-12-04 12:37:26 +0000 UTC" firstStartedPulling="2025-12-04 12:37:29.990499597 +0000 UTC m=+1453.031946164" lastFinishedPulling="2025-12-04 12:37:35.513486762 +0000 UTC m=+1458.554933330" observedRunningTime="2025-12-04 12:37:38.527143155 +0000 UTC m=+1461.568589732" watchObservedRunningTime="2025-12-04 12:37:39.456121724 +0000 UTC m=+1462.497568291" Dec 04 12:37:42 crc kubenswrapper[4760]: I1204 12:37:42.297275 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:42 crc kubenswrapper[4760]: I1204 12:37:42.473608 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fcdc7577b-rkxk5" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:42 crc kubenswrapper[4760]: I1204 12:37:42.528275 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 12:37:42 crc kubenswrapper[4760]: I1204 12:37:42.956976 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 12:37:43 crc kubenswrapper[4760]: I1204 12:37:43.498744 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 12:37:43 crc kubenswrapper[4760]: I1204 12:37:43.807524 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:37:43 crc kubenswrapper[4760]: I1204 12:37:43.914279 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" probeResult="failure" 
output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 04 12:37:43 crc kubenswrapper[4760]: I1204 12:37:43.952571 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-vt25n"] Dec 04 12:37:43 crc kubenswrapper[4760]: I1204 12:37:43.952931 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" podUID="d085487a-6288-4d8c-87b0-11b7924141f4" containerName="dnsmasq-dns" containerID="cri-o://0c104678c9e693c103e49cb8df74e189aefcc30b8ceeccc2ff4d0876a9b439d4" gracePeriod=10 Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.175692 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.175843 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.177268 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"7754d6e6486a7c466fd3b1bfb32b05bda479f30a77fc71e23d9cfe03f99ca18c"} pod="openstack/horizon-5b7fc6c944-sh7tv" containerMessage="Container horizon failed startup probe, will be restarted" Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.177332 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerName="horizon" containerID="cri-o://7754d6e6486a7c466fd3b1bfb32b05bda479f30a77fc71e23d9cfe03f99ca18c" gracePeriod=30 Dec 04 12:37:44 crc 
kubenswrapper[4760]: I1204 12:37:44.199021 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.298840 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.439099 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.535616 4760 generic.go:334] "Generic (PLEG): container finished" podID="d085487a-6288-4d8c-87b0-11b7924141f4" containerID="0c104678c9e693c103e49cb8df74e189aefcc30b8ceeccc2ff4d0876a9b439d4" exitCode=0 Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.536129 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="098dadae-38a9-440e-a726-9cd9d742c6bc" containerName="cinder-scheduler" containerID="cri-o://56e236b5971fa53c2c8d8e9a58369e07013b3b63ac37b9efb7a3bf79ebe6d9ef" gracePeriod=30 Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.544982 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" event={"ID":"d085487a-6288-4d8c-87b0-11b7924141f4","Type":"ContainerDied","Data":"0c104678c9e693c103e49cb8df74e189aefcc30b8ceeccc2ff4d0876a9b439d4"} Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.545136 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="098dadae-38a9-440e-a726-9cd9d742c6bc" containerName="probe" containerID="cri-o://80d98ed92e4cf693130940a12fe5fba680c5bb85116080dd49cfda916ea9541b" gracePeriod=30 Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.546979 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" 
containerName="cinder-volume" containerID="cri-o://3fe0632f7c778fa9eb7f4dd79d236e1034e4c66caedb12739977e04f4cbb0f55" gracePeriod=30 Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.547515 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" containerName="probe" containerID="cri-o://dfe0b7d287d85a7f8a4a34dfcfead054005af809d055a631c07c06ddceff017e" gracePeriod=30 Dec 04 12:37:44 crc kubenswrapper[4760]: I1204 12:37:44.681334 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.205765 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.305980 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-nb\") pod \"d085487a-6288-4d8c-87b0-11b7924141f4\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.306121 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-config\") pod \"d085487a-6288-4d8c-87b0-11b7924141f4\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.306375 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcd5b\" (UniqueName: \"kubernetes.io/projected/d085487a-6288-4d8c-87b0-11b7924141f4-kube-api-access-xcd5b\") pod \"d085487a-6288-4d8c-87b0-11b7924141f4\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.306468 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-swift-storage-0\") pod \"d085487a-6288-4d8c-87b0-11b7924141f4\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.306555 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-sb\") pod \"d085487a-6288-4d8c-87b0-11b7924141f4\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.306697 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-svc\") pod \"d085487a-6288-4d8c-87b0-11b7924141f4\" (UID: \"d085487a-6288-4d8c-87b0-11b7924141f4\") " Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.331680 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d085487a-6288-4d8c-87b0-11b7924141f4-kube-api-access-xcd5b" (OuterVolumeSpecName: "kube-api-access-xcd5b") pod "d085487a-6288-4d8c-87b0-11b7924141f4" (UID: "d085487a-6288-4d8c-87b0-11b7924141f4"). InnerVolumeSpecName "kube-api-access-xcd5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.417630 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcd5b\" (UniqueName: \"kubernetes.io/projected/d085487a-6288-4d8c-87b0-11b7924141f4-kube-api-access-xcd5b\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.532555 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d085487a-6288-4d8c-87b0-11b7924141f4" (UID: "d085487a-6288-4d8c-87b0-11b7924141f4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.540190 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d085487a-6288-4d8c-87b0-11b7924141f4" (UID: "d085487a-6288-4d8c-87b0-11b7924141f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.541302 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-config" (OuterVolumeSpecName: "config") pod "d085487a-6288-4d8c-87b0-11b7924141f4" (UID: "d085487a-6288-4d8c-87b0-11b7924141f4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.547491 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.547800 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.547899 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.557898 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d085487a-6288-4d8c-87b0-11b7924141f4" (UID: "d085487a-6288-4d8c-87b0-11b7924141f4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.590050 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d085487a-6288-4d8c-87b0-11b7924141f4" (UID: "d085487a-6288-4d8c-87b0-11b7924141f4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.634356 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" event={"ID":"d085487a-6288-4d8c-87b0-11b7924141f4","Type":"ContainerDied","Data":"96b9ed8efc3f0f69ebf57bb6fe629fb748af00059600c674abe5f071d2d02975"} Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.635027 4760 scope.go:117] "RemoveContainer" containerID="0c104678c9e693c103e49cb8df74e189aefcc30b8ceeccc2ff4d0876a9b439d4" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.634428 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-vt25n" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.650863 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.650906 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d085487a-6288-4d8c-87b0-11b7924141f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.683653 4760 generic.go:334] "Generic (PLEG): container finished" podID="6e2d78cb-0c7a-408f-a736-6630b41bd80b" containerID="5ba953f8ee30f593c2b31c939e252e989f85ce1c744c6b0bda4c4dd9dfefa0c1" exitCode=0 Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.683993 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerName="cinder-backup" containerID="cri-o://f366d4b6db5db971446a15c91ce9b932b1a50251bb74ea4ce78e639a44d3523d" gracePeriod=30 Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.684112 4760 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/manila-db-sync-rb22w" event={"ID":"6e2d78cb-0c7a-408f-a736-6630b41bd80b","Type":"ContainerDied","Data":"5ba953f8ee30f593c2b31c939e252e989f85ce1c744c6b0bda4c4dd9dfefa0c1"} Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.684626 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerName="probe" containerID="cri-o://157e958df3c3bf7b5f810d87d1acac2edee3d24cc48f0c80a6d939dad0c4c425" gracePeriod=30 Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.755548 4760 scope.go:117] "RemoveContainer" containerID="b60f07b58036e539ad0748a38a481f6bd0cc9b488bffc6df45e4dd38d0e2a807" Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.790978 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-vt25n"] Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.806566 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-vt25n"] Dec 04 12:37:45 crc kubenswrapper[4760]: I1204 12:37:45.996964 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d085487a-6288-4d8c-87b0-11b7924141f4" path="/var/lib/kubelet/pods/d085487a-6288-4d8c-87b0-11b7924141f4/volumes" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.268760 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-d74ff5d57-x29r6" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.664071 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 12:37:46 crc kubenswrapper[4760]: E1204 12:37:46.664869 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d085487a-6288-4d8c-87b0-11b7924141f4" containerName="dnsmasq-dns" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.664897 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d085487a-6288-4d8c-87b0-11b7924141f4" 
containerName="dnsmasq-dns" Dec 04 12:37:46 crc kubenswrapper[4760]: E1204 12:37:46.664917 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d085487a-6288-4d8c-87b0-11b7924141f4" containerName="init" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.664924 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d085487a-6288-4d8c-87b0-11b7924141f4" containerName="init" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.665160 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d085487a-6288-4d8c-87b0-11b7924141f4" containerName="dnsmasq-dns" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.673058 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.678634 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.680402 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-67s2j" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.686865 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.687672 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.690886 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.690992 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.691068 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.691350 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42mts\" (UniqueName: \"kubernetes.io/projected/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-kube-api-access-42mts\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.724721 4760 generic.go:334] "Generic (PLEG): container finished" podID="098dadae-38a9-440e-a726-9cd9d742c6bc" containerID="56e236b5971fa53c2c8d8e9a58369e07013b3b63ac37b9efb7a3bf79ebe6d9ef" exitCode=0 Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.724854 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"098dadae-38a9-440e-a726-9cd9d742c6bc","Type":"ContainerDied","Data":"56e236b5971fa53c2c8d8e9a58369e07013b3b63ac37b9efb7a3bf79ebe6d9ef"} Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.736537 4760 generic.go:334] "Generic (PLEG): container finished" podID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" containerID="3fe0632f7c778fa9eb7f4dd79d236e1034e4c66caedb12739977e04f4cbb0f55" exitCode=0 Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.736643 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"c7b719f3-24b5-44b8-9726-31c19ecc45b2","Type":"ContainerDied","Data":"3fe0632f7c778fa9eb7f4dd79d236e1034e4c66caedb12739977e04f4cbb0f55"} Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.793872 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.793992 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42mts\" (UniqueName: \"kubernetes.io/projected/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-kube-api-access-42mts\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.794084 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.794151 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.795156 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" 
Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.805401 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.821965 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config-secret\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:46 crc kubenswrapper[4760]: I1204 12:37:46.837247 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42mts\" (UniqueName: \"kubernetes.io/projected/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-kube-api-access-42mts\") pod \"openstackclient\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.007092 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.010374 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.020838 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.113683 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.115514 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.188608 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.212204 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/edbb46fc-c8ec-45c9-bdb2-36639d92402e-openstack-config-secret\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.212280 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/edbb46fc-c8ec-45c9-bdb2-36639d92402e-openstack-config\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.212310 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z4d5\" (UniqueName: \"kubernetes.io/projected/edbb46fc-c8ec-45c9-bdb2-36639d92402e-kube-api-access-7z4d5\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.212384 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbb46fc-c8ec-45c9-bdb2-36639d92402e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.315289 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/edbb46fc-c8ec-45c9-bdb2-36639d92402e-openstack-config-secret\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.315359 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/edbb46fc-c8ec-45c9-bdb2-36639d92402e-openstack-config\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.315390 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z4d5\" (UniqueName: \"kubernetes.io/projected/edbb46fc-c8ec-45c9-bdb2-36639d92402e-kube-api-access-7z4d5\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.315471 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbb46fc-c8ec-45c9-bdb2-36639d92402e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.318732 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/edbb46fc-c8ec-45c9-bdb2-36639d92402e-openstack-config\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.324873 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbb46fc-c8ec-45c9-bdb2-36639d92402e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " 
pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.333015 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/edbb46fc-c8ec-45c9-bdb2-36639d92402e-openstack-config-secret\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.365330 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z4d5\" (UniqueName: \"kubernetes.io/projected/edbb46fc-c8ec-45c9-bdb2-36639d92402e-kube-api-access-7z4d5\") pod \"openstackclient\" (UID: \"edbb46fc-c8ec-45c9-bdb2-36639d92402e\") " pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.556848 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fcdc7577b-rkxk5" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.598129 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.618572 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:37:47 crc kubenswrapper[4760]: E1204 12:37:47.639980 4760 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 12:37:47 crc kubenswrapper[4760]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_6b3b3252-d161-485f-8cba-5da8dd2b3c7e_0(2e434d99ae75d87b6c6e86902306d64c417c677dc2292093dd474af81dc17ddb): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2e434d99ae75d87b6c6e86902306d64c417c677dc2292093dd474af81dc17ddb" Netns:"/var/run/netns/fcd1a0fe-d50b-4e9c-b82f-eb8c71e46bce" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=2e434d99ae75d87b6c6e86902306d64c417c677dc2292093dd474af81dc17ddb;K8S_POD_UID=6b3b3252-d161-485f-8cba-5da8dd2b3c7e" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/6b3b3252-d161-485f-8cba-5da8dd2b3c7e]: expected pod UID "6b3b3252-d161-485f-8cba-5da8dd2b3c7e" but got "edbb46fc-c8ec-45c9-bdb2-36639d92402e" from Kube API Dec 04 12:37:47 crc kubenswrapper[4760]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 12:37:47 crc kubenswrapper[4760]: > Dec 04 12:37:47 crc kubenswrapper[4760]: E1204 12:37:47.640069 4760 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 12:37:47 crc kubenswrapper[4760]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_6b3b3252-d161-485f-8cba-5da8dd2b3c7e_0(2e434d99ae75d87b6c6e86902306d64c417c677dc2292093dd474af81dc17ddb): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2e434d99ae75d87b6c6e86902306d64c417c677dc2292093dd474af81dc17ddb" Netns:"/var/run/netns/fcd1a0fe-d50b-4e9c-b82f-eb8c71e46bce" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=2e434d99ae75d87b6c6e86902306d64c417c677dc2292093dd474af81dc17ddb;K8S_POD_UID=6b3b3252-d161-485f-8cba-5da8dd2b3c7e" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/6b3b3252-d161-485f-8cba-5da8dd2b3c7e]: expected pod UID "6b3b3252-d161-485f-8cba-5da8dd2b3c7e" but got "edbb46fc-c8ec-45c9-bdb2-36639d92402e" from Kube API Dec 04 12:37:47 crc kubenswrapper[4760]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 12:37:47 crc kubenswrapper[4760]: > pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.830454 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.836675 4760 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6b3b3252-d161-485f-8cba-5da8dd2b3c7e" podUID="edbb46fc-c8ec-45c9-bdb2-36639d92402e" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.904814 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 12:37:47 crc kubenswrapper[4760]: I1204 12:37:47.935917 4760 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6b3b3252-d161-485f-8cba-5da8dd2b3c7e" podUID="edbb46fc-c8ec-45c9-bdb2-36639d92402e" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.026440 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config\") pod \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.026530 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42mts\" (UniqueName: \"kubernetes.io/projected/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-kube-api-access-42mts\") pod \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.026759 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config-secret\") pod \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.026831 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-combined-ca-bundle\") pod \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\" (UID: \"6b3b3252-d161-485f-8cba-5da8dd2b3c7e\") " Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.027361 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6b3b3252-d161-485f-8cba-5da8dd2b3c7e" (UID: "6b3b3252-d161-485f-8cba-5da8dd2b3c7e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.027724 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.060201 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b3b3252-d161-485f-8cba-5da8dd2b3c7e" (UID: "6b3b3252-d161-485f-8cba-5da8dd2b3c7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.078438 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6b3b3252-d161-485f-8cba-5da8dd2b3c7e" (UID: "6b3b3252-d161-485f-8cba-5da8dd2b3c7e"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.083581 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-kube-api-access-42mts" (OuterVolumeSpecName: "kube-api-access-42mts") pod "6b3b3252-d161-485f-8cba-5da8dd2b3c7e" (UID: "6b3b3252-d161-485f-8cba-5da8dd2b3c7e"). InnerVolumeSpecName "kube-api-access-42mts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.139983 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.150655 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.150909 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42mts\" (UniqueName: \"kubernetes.io/projected/6b3b3252-d161-485f-8cba-5da8dd2b3c7e-kube-api-access-42mts\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.152096 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-rb22w" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.190613 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.252948 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-combined-ca-bundle\") pod \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.253016 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpr4j\" (UniqueName: \"kubernetes.io/projected/6e2d78cb-0c7a-408f-a736-6630b41bd80b-kube-api-access-fpr4j\") pod \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.253062 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-job-config-data\") pod \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.253242 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-config-data\") pod \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\" (UID: \"6e2d78cb-0c7a-408f-a736-6630b41bd80b\") " Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.281536 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "6e2d78cb-0c7a-408f-a736-6630b41bd80b" (UID: 
"6e2d78cb-0c7a-408f-a736-6630b41bd80b"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.283388 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-config-data" (OuterVolumeSpecName: "config-data") pod "6e2d78cb-0c7a-408f-a736-6630b41bd80b" (UID: "6e2d78cb-0c7a-408f-a736-6630b41bd80b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.286649 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2d78cb-0c7a-408f-a736-6630b41bd80b-kube-api-access-fpr4j" (OuterVolumeSpecName: "kube-api-access-fpr4j") pod "6e2d78cb-0c7a-408f-a736-6630b41bd80b" (UID: "6e2d78cb-0c7a-408f-a736-6630b41bd80b"). InnerVolumeSpecName "kube-api-access-fpr4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.322404 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e2d78cb-0c7a-408f-a736-6630b41bd80b" (UID: "6e2d78cb-0c7a-408f-a736-6630b41bd80b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.357280 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.357337 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpr4j\" (UniqueName: \"kubernetes.io/projected/6e2d78cb-0c7a-408f-a736-6630b41bd80b-kube-api-access-fpr4j\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.357358 4760 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.357370 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e2d78cb-0c7a-408f-a736-6630b41bd80b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.819999 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.843501 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6999bbbcbb-mwnkr" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.886361 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.955813 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.166:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Dec 04 12:37:48 crc kubenswrapper[4760]: W1204 12:37:48.962465 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedbb46fc_c8ec_45c9_bdb2_36639d92402e.slice/crio-312d9a8627a2f57126629b9778529343dd0a7e09dc62ea7f9d4d3c146ca19011 WatchSource:0}: Error finding container 312d9a8627a2f57126629b9778529343dd0a7e09dc62ea7f9d4d3c146ca19011: Status 404 returned error can't find the container with id 312d9a8627a2f57126629b9778529343dd0a7e09dc62ea7f9d4d3c146ca19011 Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.991358 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rb22w" event={"ID":"6e2d78cb-0c7a-408f-a736-6630b41bd80b","Type":"ContainerDied","Data":"e0ce070d49b5495fa48fbcd38608443a459dfbd4a91e648839a36ecdb6a68db0"} Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.991974 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ce070d49b5495fa48fbcd38608443a459dfbd4a91e648839a36ecdb6a68db0" Dec 04 12:37:48 crc kubenswrapper[4760]: I1204 12:37:48.992162 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-rb22w" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.094094 4760 generic.go:334] "Generic (PLEG): container finished" podID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerID="f366d4b6db5db971446a15c91ce9b932b1a50251bb74ea4ce78e639a44d3523d" exitCode=0 Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.094534 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"43ec58b9-74ae-4457-928c-e20bb1c496ec","Type":"ContainerDied","Data":"f366d4b6db5db971446a15c91ce9b932b1a50251bb74ea4ce78e639a44d3523d"} Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.131547 4760 generic.go:334] "Generic (PLEG): container finished" podID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" containerID="dfe0b7d287d85a7f8a4a34dfcfead054005af809d055a631c07c06ddceff017e" exitCode=0 Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.131685 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.131806 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c7b719f3-24b5-44b8-9726-31c19ecc45b2","Type":"ContainerDied","Data":"dfe0b7d287d85a7f8a4a34dfcfead054005af809d055a631c07c06ddceff017e"} Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.142271 4760 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6b3b3252-d161-485f-8cba-5da8dd2b3c7e" podUID="edbb46fc-c8ec-45c9-bdb2-36639d92402e" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.170344 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.181520 4760 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6b3b3252-d161-485f-8cba-5da8dd2b3c7e" podUID="edbb46fc-c8ec-45c9-bdb2-36639d92402e" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287123 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9csgk\" (UniqueName: \"kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-kube-api-access-9csgk\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287189 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-machine-id\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287248 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-lib-cinder\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287302 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-lib-modules\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287322 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-brick\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287399 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-run\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287437 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-sys\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287574 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data-custom\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287629 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287669 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-combined-ca-bundle\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287699 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-nvme\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287770 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-iscsi\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287786 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-dev\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287813 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-ceph\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287837 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-scripts\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.287865 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-cinder\") pod \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\" (UID: \"c7b719f3-24b5-44b8-9726-31c19ecc45b2\") " Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 
12:37:49.289880 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.289974 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.290003 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.290027 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.290063 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-run" (OuterVolumeSpecName: "run") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.290086 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-sys" (OuterVolumeSpecName: "sys") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.291746 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.292157 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.292758 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-dev" (OuterVolumeSpecName: "dev") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.292888 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.303789 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.314247 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-ceph" (OuterVolumeSpecName: "ceph") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.315560 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-kube-api-access-9csgk" (OuterVolumeSpecName: "kube-api-access-9csgk") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "kube-api-access-9csgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.327505 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-scripts" (OuterVolumeSpecName: "scripts") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393089 4760 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393131 4760 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-dev\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393140 4760 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393148 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393157 
4760 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393172 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9csgk\" (UniqueName: \"kubernetes.io/projected/c7b719f3-24b5-44b8-9726-31c19ecc45b2-kube-api-access-9csgk\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393181 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393189 4760 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393197 4760 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393226 4760 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393235 4760 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-run\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393247 4760 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-sys\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393255 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.393264 4760 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c7b719f3-24b5-44b8-9726-31c19ecc45b2-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.475610 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.511813 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.661812 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 12:37:49 crc kubenswrapper[4760]: E1204 12:37:49.662569 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2d78cb-0c7a-408f-a736-6630b41bd80b" containerName="manila-db-sync" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.662596 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2d78cb-0c7a-408f-a736-6630b41bd80b" containerName="manila-db-sync" Dec 04 12:37:49 crc kubenswrapper[4760]: E1204 12:37:49.662621 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" containerName="cinder-volume" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.662629 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" containerName="cinder-volume" Dec 04 12:37:49 crc kubenswrapper[4760]: E1204 12:37:49.662649 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" containerName="probe" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.662661 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" containerName="probe" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.664595 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" containerName="probe" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.664661 4760 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" containerName="cinder-volume" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.664679 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2d78cb-0c7a-408f-a736-6630b41bd80b" containerName="manila-db-sync" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.668139 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.681748 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-tnswh" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.682092 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.682356 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.682754 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.755069 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.773604 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.773890 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrfxk\" (UniqueName: \"kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-kube-api-access-rrfxk\") pod 
\"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.774061 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-scripts\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.776110 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.776172 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.776272 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-ceph\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.776665 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: 
\"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.784533 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.812094 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.817585 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.897334 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.905084 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrfxk\" (UniqueName: \"kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-kube-api-access-rrfxk\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.905191 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-scripts\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.905487 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-combined-ca-bundle\") pod 
\"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.905513 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.905565 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-ceph\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.905650 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.905704 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.905783 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc 
kubenswrapper[4760]: I1204 12:37:49.906682 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.912251 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.913075 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-ceph\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.917410 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.918237 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data" (OuterVolumeSpecName: "config-data") pod "c7b719f3-24b5-44b8-9726-31c19ecc45b2" (UID: "c7b719f3-24b5-44b8-9726-31c19ecc45b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.918696 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-scripts\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.921678 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b3b3252-d161-485f-8cba-5da8dd2b3c7e" path="/var/lib/kubelet/pods/6b3b3252-d161-485f-8cba-5da8dd2b3c7e/volumes" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.923092 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.936142 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.955776 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 12:37:49 crc kubenswrapper[4760]: I1204 12:37:49.957641 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrfxk\" (UniqueName: \"kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-kube-api-access-rrfxk\") pod \"manila-share-share1-0\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " pod="openstack/manila-share-share1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.009186 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-scripts\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.009669 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.009826 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/556dacca-8542-42df-97e5-a09db3716d3f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.009943 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rz2k\" (UniqueName: \"kubernetes.io/projected/556dacca-8542-42df-97e5-a09db3716d3f-kube-api-access-9rz2k\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.010074 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.010553 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.010938 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b719f3-24b5-44b8-9726-31c19ecc45b2-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.025336 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56696ff475-64jjh"] Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.028069 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.085543 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56696ff475-64jjh"] Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.086697 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.087833 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-658b6c7fb4-vh8cp" podUID="ac9e67e5-eea3-4608-bd30-8483225d28d2" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.115382 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.115534 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-scripts\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.115588 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.115610 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/556dacca-8542-42df-97e5-a09db3716d3f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.115634 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rz2k\" (UniqueName: \"kubernetes.io/projected/556dacca-8542-42df-97e5-a09db3716d3f-kube-api-access-9rz2k\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.115658 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.117479 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/556dacca-8542-42df-97e5-a09db3716d3f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.128044 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.129890 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.145874 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-scripts\") pod 
\"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.147497 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.216492 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rz2k\" (UniqueName: \"kubernetes.io/projected/556dacca-8542-42df-97e5-a09db3716d3f-kube-api-access-9rz2k\") pod \"manila-scheduler-0\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.218734 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-sb\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.218817 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztkt7\" (UniqueName: \"kubernetes.io/projected/120ed25b-47b0-4306-974b-a66255cac4ce-kube-api-access-ztkt7\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.218868 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-svc\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: 
\"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.218930 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-config\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.218973 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-nb\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.219016 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-swift-storage-0\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.274930 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.277647 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.277684 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"edbb46fc-c8ec-45c9-bdb2-36639d92402e","Type":"ContainerStarted","Data":"312d9a8627a2f57126629b9778529343dd0a7e09dc62ea7f9d4d3c146ca19011"} Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.277828 4760 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.281565 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.309092 4760 generic.go:334] "Generic (PLEG): container finished" podID="098dadae-38a9-440e-a726-9cd9d742c6bc" containerID="80d98ed92e4cf693130940a12fe5fba680c5bb85116080dd49cfda916ea9541b" exitCode=0 Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.309175 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"098dadae-38a9-440e-a726-9cd9d742c6bc","Type":"ContainerDied","Data":"80d98ed92e4cf693130940a12fe5fba680c5bb85116080dd49cfda916ea9541b"} Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.335507 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-sb\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.364661 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztkt7\" (UniqueName: \"kubernetes.io/projected/120ed25b-47b0-4306-974b-a66255cac4ce-kube-api-access-ztkt7\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.364798 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-svc\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc 
kubenswrapper[4760]: I1204 12:37:50.364979 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-config\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.365074 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-nb\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.365136 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-swift-storage-0\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.342183 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-sb\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.367445 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-svc\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.368187 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-nb\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.369144 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-swift-storage-0\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.369351 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-config\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.376899 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"c7b719f3-24b5-44b8-9726-31c19ecc45b2","Type":"ContainerDied","Data":"602ca75d518ab2b5c30e9a7379238160d6adc2113f4ba2b335b0d4f14185facb"} Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.376998 4760 scope.go:117] "RemoveContainer" containerID="dfe0b7d287d85a7f8a4a34dfcfead054005af809d055a631c07c06ddceff017e" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.377140 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.377409 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.399328 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztkt7\" (UniqueName: \"kubernetes.io/projected/120ed25b-47b0-4306-974b-a66255cac4ce-kube-api-access-ztkt7\") pod \"dnsmasq-dns-56696ff475-64jjh\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.423417 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.435470 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-658b6c7fb4-vh8cp" podUID="ac9e67e5-eea3-4608-bd30-8483225d28d2" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.437080 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.470174 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data\") pod \"098dadae-38a9-440e-a726-9cd9d742c6bc\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.471076 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-scripts\") pod \"098dadae-38a9-440e-a726-9cd9d742c6bc\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.471393 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/098dadae-38a9-440e-a726-9cd9d742c6bc-etc-machine-id\") pod \"098dadae-38a9-440e-a726-9cd9d742c6bc\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.474136 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-combined-ca-bundle\") pod \"098dadae-38a9-440e-a726-9cd9d742c6bc\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.484946 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data-custom\") pod \"098dadae-38a9-440e-a726-9cd9d742c6bc\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.485358 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpg7b\" (UniqueName: 
\"kubernetes.io/projected/098dadae-38a9-440e-a726-9cd9d742c6bc-kube-api-access-cpg7b\") pod \"098dadae-38a9-440e-a726-9cd9d742c6bc\" (UID: \"098dadae-38a9-440e-a726-9cd9d742c6bc\") " Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.485961 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txhd8\" (UniqueName: \"kubernetes.io/projected/da38006e-8a3a-4218-bacf-0e24e2cf9149-kube-api-access-txhd8\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.486150 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.488926 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-scripts\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.489043 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da38006e-8a3a-4218-bacf-0e24e2cf9149-etc-machine-id\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.489292 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da38006e-8a3a-4218-bacf-0e24e2cf9149-logs\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") 
" pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.471589 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/098dadae-38a9-440e-a726-9cd9d742c6bc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "098dadae-38a9-440e-a726-9cd9d742c6bc" (UID: "098dadae-38a9-440e-a726-9cd9d742c6bc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.493986 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "098dadae-38a9-440e-a726-9cd9d742c6bc" (UID: "098dadae-38a9-440e-a726-9cd9d742c6bc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.494307 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-scripts" (OuterVolumeSpecName: "scripts") pod "098dadae-38a9-440e-a726-9cd9d742c6bc" (UID: "098dadae-38a9-440e-a726-9cd9d742c6bc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.494493 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data-custom\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.494706 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.494992 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.495064 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/098dadae-38a9-440e-a726-9cd9d742c6bc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.495128 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.501305 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.536700 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/098dadae-38a9-440e-a726-9cd9d742c6bc-kube-api-access-cpg7b" 
(OuterVolumeSpecName: "kube-api-access-cpg7b") pod "098dadae-38a9-440e-a726-9cd9d742c6bc" (UID: "098dadae-38a9-440e-a726-9cd9d742c6bc"). InnerVolumeSpecName "kube-api-access-cpg7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.536832 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.605885 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txhd8\" (UniqueName: \"kubernetes.io/projected/da38006e-8a3a-4218-bacf-0e24e2cf9149-kube-api-access-txhd8\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.605971 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.606084 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-scripts\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.606114 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da38006e-8a3a-4218-bacf-0e24e2cf9149-etc-machine-id\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.606176 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/da38006e-8a3a-4218-bacf-0e24e2cf9149-logs\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.606247 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data-custom\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.606276 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.606404 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpg7b\" (UniqueName: \"kubernetes.io/projected/098dadae-38a9-440e-a726-9cd9d742c6bc-kube-api-access-cpg7b\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.613453 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da38006e-8a3a-4218-bacf-0e24e2cf9149-etc-machine-id\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.613813 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da38006e-8a3a-4218-bacf-0e24e2cf9149-logs\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.627500 4760 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-volume-volume1-0"] Dec 04 12:37:50 crc kubenswrapper[4760]: E1204 12:37:50.634043 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098dadae-38a9-440e-a726-9cd9d742c6bc" containerName="cinder-scheduler" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.634093 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="098dadae-38a9-440e-a726-9cd9d742c6bc" containerName="cinder-scheduler" Dec 04 12:37:50 crc kubenswrapper[4760]: E1204 12:37:50.634107 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="098dadae-38a9-440e-a726-9cd9d742c6bc" containerName="probe" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.634115 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="098dadae-38a9-440e-a726-9cd9d742c6bc" containerName="probe" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.634381 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="098dadae-38a9-440e-a726-9cd9d742c6bc" containerName="probe" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.634401 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="098dadae-38a9-440e-a726-9cd9d742c6bc" containerName="cinder-scheduler" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.628719 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.636263 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.637157 4760 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "098dadae-38a9-440e-a726-9cd9d742c6bc" (UID: "098dadae-38a9-440e-a726-9cd9d742c6bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.643124 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-scripts\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.647609 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.649070 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data-custom\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.658563 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-869c8d7d5c-srd5v" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.667390 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.715604 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " 
pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.744730 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.715631 4760 scope.go:117] "RemoveContainer" containerID="3fe0632f7c778fa9eb7f4dd79d236e1034e4c66caedb12739977e04f4cbb0f55" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.744555 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txhd8\" (UniqueName: \"kubernetes.io/projected/da38006e-8a3a-4218-bacf-0e24e2cf9149-kube-api-access-txhd8\") pod \"manila-api-0\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.757922 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-dev\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.769480 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-sys\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.769558 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-lib-modules\") pod 
\"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.769593 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.769631 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftvtp\" (UniqueName: \"kubernetes.io/projected/82e6f33c-9843-4810-9cec-5b7b7525d759-kube-api-access-ftvtp\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.769741 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.769876 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-run\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.769964 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: 
\"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.769992 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.770099 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.770156 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.770184 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/82e6f33c-9843-4810-9cec-5b7b7525d759-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.770243 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: 
\"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.770345 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.770465 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.790156 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.844449 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data" (OuterVolumeSpecName: "config-data") pod "098dadae-38a9-440e-a726-9cd9d742c6bc" (UID: "098dadae-38a9-440e-a726-9cd9d742c6bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882348 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882449 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882493 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/82e6f33c-9843-4810-9cec-5b7b7525d759-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882539 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882594 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882644 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882687 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882750 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-dev\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882823 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-sys\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882855 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882908 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.882980 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftvtp\" (UniqueName: \"kubernetes.io/projected/82e6f33c-9843-4810-9cec-5b7b7525d759-kube-api-access-ftvtp\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.883069 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.883200 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-run\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.883303 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.883352 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: 
\"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.883547 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/098dadae-38a9-440e-a726-9cd9d742c6bc-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.886048 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.890901 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.896677 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/82e6f33c-9843-4810-9cec-5b7b7525d759-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.896768 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.902315 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.912838 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.912918 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-dev\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.913695 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-sys\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.913733 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.916806 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 
04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.916932 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.918030 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-run\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.918259 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.918376 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/82e6f33c-9843-4810-9cec-5b7b7525d759-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.922172 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82e6f33c-9843-4810-9cec-5b7b7525d759-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.951417 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.986150 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f9db4bf7b-jdfxz"] Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.986615 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f9db4bf7b-jdfxz" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-api" containerID="cri-o://cc4bb714b5e43dd8d68e6969cad10449ea5df26f89281f0554ab6248963395e2" gracePeriod=30 Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.987309 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f9db4bf7b-jdfxz" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-httpd" containerID="cri-o://7d02f4faa6170b88d08cadd8e1179e5e036635971e3b0544bf1f05777d23a70c" gracePeriod=30 Dec 04 12:37:50 crc kubenswrapper[4760]: I1204 12:37:50.987370 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftvtp\" (UniqueName: \"kubernetes.io/projected/82e6f33c-9843-4810-9cec-5b7b7525d759-kube-api-access-ftvtp\") pod \"cinder-volume-volume1-0\" (UID: \"82e6f33c-9843-4810-9cec-5b7b7525d759\") " pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.065057 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.090550 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-658b6c7fb4-vh8cp" podUID="ac9e67e5-eea3-4608-bd30-8483225d28d2" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.091072 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-658b6c7fb4-vh8cp" podUID="ac9e67e5-eea3-4608-bd30-8483225d28d2" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.142876 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.453181 4760 generic.go:334] "Generic (PLEG): container finished" podID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerID="7d02f4faa6170b88d08cadd8e1179e5e036635971e3b0544bf1f05777d23a70c" exitCode=0 Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.453447 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f9db4bf7b-jdfxz" event={"ID":"94d936ad-6af2-4f94-8d9e-0111032b5cad","Type":"ContainerDied","Data":"7d02f4faa6170b88d08cadd8e1179e5e036635971e3b0544bf1f05777d23a70c"} Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.486615 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"098dadae-38a9-440e-a726-9cd9d742c6bc","Type":"ContainerDied","Data":"0b361e9ecf4bbab8736a42319953d08ec796951c6ea844eb51df929c3682f97a"} Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.486697 4760 scope.go:117] 
"RemoveContainer" containerID="80d98ed92e4cf693130940a12fe5fba680c5bb85116080dd49cfda916ea9541b" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.486920 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.610883 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.628041 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.653382 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.658087 4760 scope.go:117] "RemoveContainer" containerID="56e236b5971fa53c2c8d8e9a58369e07013b3b63ac37b9efb7a3bf79ebe6d9ef" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.671420 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.682378 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.687532 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.697055 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.899510 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwp4\" (UniqueName: \"kubernetes.io/projected/5bf94526-eda5-4784-a223-e0ff51ec09e8-kube-api-access-rrwp4\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.899617 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-config-data\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.899693 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.899714 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf94526-eda5-4784-a223-e0ff51ec09e8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 
12:37:51.899752 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-scripts\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:51 crc kubenswrapper[4760]: I1204 12:37:51.899791 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.003263 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.003644 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf94526-eda5-4784-a223-e0ff51ec09e8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.003707 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-scripts\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.003773 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.003820 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5bf94526-eda5-4784-a223-e0ff51ec09e8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.003874 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwp4\" (UniqueName: \"kubernetes.io/projected/5bf94526-eda5-4784-a223-e0ff51ec09e8-kube-api-access-rrwp4\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.003986 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-config-data\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.017174 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="098dadae-38a9-440e-a726-9cd9d742c6bc" path="/var/lib/kubelet/pods/098dadae-38a9-440e-a726-9cd9d742c6bc/volumes" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.033825 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.033825 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.034184 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-scripts\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.058265 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwp4\" (UniqueName: \"kubernetes.io/projected/5bf94526-eda5-4784-a223-e0ff51ec09e8-kube-api-access-rrwp4\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.076342 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bf94526-eda5-4784-a223-e0ff51ec09e8-config-data\") pod \"cinder-scheduler-0\" (UID: \"5bf94526-eda5-4784-a223-e0ff51ec09e8\") " pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.104809 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b719f3-24b5-44b8-9726-31c19ecc45b2" path="/var/lib/kubelet/pods/c7b719f3-24b5-44b8-9726-31c19ecc45b2/volumes" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.105739 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.105777 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56696ff475-64jjh"] Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.355660 
4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.479320 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.593986 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.597334 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"556dacca-8542-42df-97e5-a09db3716d3f","Type":"ContainerStarted","Data":"2b50fc9d0052371ad1eb69f62a874934a543ee02474dc0bfef93f413527e945c"} Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.621174 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d4cd7036-99ee-48dc-8df1-63c34f54087b","Type":"ContainerStarted","Data":"47d31f3c802336c79b29ff4343e965c9619fd737346aac6fd6babeed7986065d"} Dec 04 12:37:52 crc kubenswrapper[4760]: W1204 12:37:52.653012 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82e6f33c_9843_4810_9cec_5b7b7525d759.slice/crio-b570fea63c1ddd167f49dfb32cc9f4a7e9547b316cb35226b640a562ee9f10d4 WatchSource:0}: Error finding container b570fea63c1ddd167f49dfb32cc9f4a7e9547b316cb35226b640a562ee9f10d4: Status 404 returned error can't find the container with id b570fea63c1ddd167f49dfb32cc9f4a7e9547b316cb35226b640a562ee9f10d4 Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.653057 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56696ff475-64jjh" event={"ID":"120ed25b-47b0-4306-974b-a66255cac4ce","Type":"ContainerStarted","Data":"9bfd0624544275c4a27f6e882d4a6d72319abe52ef59029ba612e3dc1e1b1590"} Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.682575 4760 generic.go:334] "Generic (PLEG): container 
finished" podID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerID="157e958df3c3bf7b5f810d87d1acac2edee3d24cc48f0c80a6d939dad0c4c425" exitCode=0 Dec 04 12:37:52 crc kubenswrapper[4760]: I1204 12:37:52.682635 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"43ec58b9-74ae-4457-928c-e20bb1c496ec","Type":"ContainerDied","Data":"157e958df3c3bf7b5f810d87d1acac2edee3d24cc48f0c80a6d939dad0c4c425"} Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.696643 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.759568 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"da38006e-8a3a-4218-bacf-0e24e2cf9149","Type":"ContainerStarted","Data":"ca092249b89a27342b71e78cd224f85a8c875cbe8efa285849fc230754edccc0"} Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.775269 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.775796 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"43ec58b9-74ae-4457-928c-e20bb1c496ec","Type":"ContainerDied","Data":"a4535353afb3bf0c9620eef21b7ac71234e6d0d9d69a0cb10f743093b61cb1a2"} Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.775969 4760 scope.go:117] "RemoveContainer" containerID="157e958df3c3bf7b5f810d87d1acac2edee3d24cc48f0c80a6d939dad0c4c425" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.833787 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"82e6f33c-9843-4810-9cec-5b7b7525d759","Type":"ContainerStarted","Data":"b570fea63c1ddd167f49dfb32cc9f4a7e9547b316cb35226b640a562ee9f10d4"} Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840312 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-lib-modules\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840395 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-combined-ca-bundle\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840438 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840535 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-machine-id\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840558 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-ceph\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840595 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpwk8\" (UniqueName: \"kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-kube-api-access-wpwk8\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840695 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-sys\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840725 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-scripts\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840766 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-run\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc 
kubenswrapper[4760]: I1204 12:37:53.840830 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-dev\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840891 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-iscsi\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840915 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-lib-cinder\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840956 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-brick\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.840989 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-cinder\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.841009 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-nvme\") pod 
\"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.841041 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data-custom\") pod \"43ec58b9-74ae-4457-928c-e20bb1c496ec\" (UID: \"43ec58b9-74ae-4457-928c-e20bb1c496ec\") " Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.842131 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.842338 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.842416 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-run" (OuterVolumeSpecName: "run") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.842442 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-dev" (OuterVolumeSpecName: "dev") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.842460 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.842470 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.842985 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.843020 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.843046 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-sys" (OuterVolumeSpecName: "sys") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.843188 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.877493 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-kube-api-access-wpwk8" (OuterVolumeSpecName: "kube-api-access-wpwk8") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "kube-api-access-wpwk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.916891 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.954044 4760 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-brick\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.954127 4760 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.954143 4760 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-nvme\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.954154 4760 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-lib-modules\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.954165 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.954184 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpwk8\" (UniqueName: 
\"kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-kube-api-access-wpwk8\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.954201 4760 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-sys\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.954314 4760 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-run\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.954879 4760 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-dev\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.955030 4760 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-etc-iscsi\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.955137 4760 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/43ec58b9-74ae-4457-928c-e20bb1c496ec-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.968241 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-scripts" (OuterVolumeSpecName: "scripts") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.984336 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.984636 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-ceph" (OuterVolumeSpecName: "ceph") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:53 crc kubenswrapper[4760]: I1204 12:37:53.998451 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.166:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.022076 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.022137 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.023427 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"850fe6326e562e31921afe14eb22f002e3b2f4fe609aaeedf11c8c3082f601e7"} pod="openstack/horizon-66f8fb5648-87dff" containerMessage="Container horizon failed startup probe, will be restarted" 
Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.023488 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" containerID="cri-o://850fe6326e562e31921afe14eb22f002e3b2f4fe609aaeedf11c8c3082f601e7" gracePeriod=30 Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.027413 4760 scope.go:117] "RemoveContainer" containerID="f366d4b6db5db971446a15c91ce9b932b1a50251bb74ea4ce78e639a44d3523d" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.063525 4760 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/43ec58b9-74ae-4457-928c-e20bb1c496ec-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.063562 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.063572 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.252404 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.272584 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.304448 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data" (OuterVolumeSpecName: "config-data") pod "43ec58b9-74ae-4457-928c-e20bb1c496ec" (UID: "43ec58b9-74ae-4457-928c-e20bb1c496ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.375938 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ec58b9-74ae-4457-928c-e20bb1c496ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.604352 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.667329 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.729422 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 04 12:37:54 crc kubenswrapper[4760]: E1204 12:37:54.730483 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerName="probe" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.730506 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerName="probe" Dec 04 12:37:54 crc kubenswrapper[4760]: E1204 12:37:54.730521 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerName="cinder-backup" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.730528 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerName="cinder-backup" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.730815 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerName="probe" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.730840 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ec58b9-74ae-4457-928c-e20bb1c496ec" containerName="cinder-backup" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.732440 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.737896 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 04 12:37:54 crc kubenswrapper[4760]: I1204 12:37:54.790050 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.056301 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.083695 4760 generic.go:334] "Generic (PLEG): container finished" podID="120ed25b-47b0-4306-974b-a66255cac4ce" containerID="39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134" exitCode=0 Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.083805 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56696ff475-64jjh" event={"ID":"120ed25b-47b0-4306-974b-a66255cac4ce","Type":"ContainerDied","Data":"39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134"} Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095234 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-dev\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095276 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-config-data\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095301 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/21d77c4c-3493-44c4-b194-6d9dd912d5a1-ceph\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095345 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095363 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-sys\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095387 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-run\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095433 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvl2\" (UniqueName: \"kubernetes.io/projected/21d77c4c-3493-44c4-b194-6d9dd912d5a1-kube-api-access-mvvl2\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095457 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-config-data-custom\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095478 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-scripts\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095496 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095524 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-combined-ca-bundle\") pod 
\"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095553 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-lib-modules\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095581 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-etc-nvme\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095604 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095625 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.095678 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 
12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.102005 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5bf94526-eda5-4784-a223-e0ff51ec09e8","Type":"ContainerStarted","Data":"5961773b43ec0ab960349f106fac978ba30c5da174d9b875a4645424aaf46746"} Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.106858 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-658b6c7fb4-vh8cp" podUID="ac9e67e5-eea3-4608-bd30-8483225d28d2" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.200734 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"82e6f33c-9843-4810-9cec-5b7b7525d759","Type":"ContainerStarted","Data":"5da5615d0da033e11a6c862c613908e382a2f61d0b6626179b704d2ace530374"} Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203184 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203337 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-sys\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203386 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-run\") pod \"cinder-backup-0\" (UID: 
\"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203465 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvl2\" (UniqueName: \"kubernetes.io/projected/21d77c4c-3493-44c4-b194-6d9dd912d5a1-kube-api-access-mvvl2\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203491 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-config-data-custom\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203513 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-scripts\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203571 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203612 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203653 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-lib-modules\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203718 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-etc-nvme\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203754 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203788 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203882 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203941 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-dev\") pod \"cinder-backup-0\" (UID: 
\"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203962 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-config-data\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.203982 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/21d77c4c-3493-44c4-b194-6d9dd912d5a1-ceph\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.207387 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.207484 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-lib-modules\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.207876 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-etc-nvme\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.207920 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.208152 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.208518 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-dev\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.209067 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-run\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.209269 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.209299 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-sys\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.217063 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-config-data\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.221886 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21d77c4c-3493-44c4-b194-6d9dd912d5a1-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.224474 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/21d77c4c-3493-44c4-b194-6d9dd912d5a1-ceph\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.226987 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-scripts\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.229451 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.230054 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21d77c4c-3493-44c4-b194-6d9dd912d5a1-config-data-custom\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" 
Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.268668 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvl2\" (UniqueName: \"kubernetes.io/projected/21d77c4c-3493-44c4-b194-6d9dd912d5a1-kube-api-access-mvvl2\") pod \"cinder-backup-0\" (UID: \"21d77c4c-3493-44c4-b194-6d9dd912d5a1\") " pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.499594 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-658b6c7fb4-vh8cp" podUID="ac9e67e5-eea3-4608-bd30-8483225d28d2" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.505597 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 12:37:55 crc kubenswrapper[4760]: I1204 12:37:55.909814 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ec58b9-74ae-4457-928c-e20bb1c496ec" path="/var/lib/kubelet/pods/43ec58b9-74ae-4457-928c-e20bb1c496ec/volumes" Dec 04 12:37:56 crc kubenswrapper[4760]: I1204 12:37:56.097509 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-658b6c7fb4-vh8cp" podUID="ac9e67e5-eea3-4608-bd30-8483225d28d2" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:56 crc kubenswrapper[4760]: I1204 12:37:56.148825 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-658b6c7fb4-vh8cp" Dec 04 12:37:56 crc kubenswrapper[4760]: I1204 12:37:56.389964 4760 generic.go:334] "Generic (PLEG): container finished" podID="94d936ad-6af2-4f94-8d9e-0111032b5cad" 
containerID="cc4bb714b5e43dd8d68e6969cad10449ea5df26f89281f0554ab6248963395e2" exitCode=0 Dec 04 12:37:56 crc kubenswrapper[4760]: I1204 12:37:56.392490 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f9db4bf7b-jdfxz" event={"ID":"94d936ad-6af2-4f94-8d9e-0111032b5cad","Type":"ContainerDied","Data":"cc4bb714b5e43dd8d68e6969cad10449ea5df26f89281f0554ab6248963395e2"} Dec 04 12:37:56 crc kubenswrapper[4760]: I1204 12:37:56.413915 4760 generic.go:334] "Generic (PLEG): container finished" podID="23335f60-d3db-4308-b1fe-a4603a8d65e7" containerID="6cd4b7442c5d8c26b0f74bce2e01f566fd68b0161b693e247c6a5f2f6624bc3d" exitCode=137 Dec 04 12:37:56 crc kubenswrapper[4760]: I1204 12:37:56.414032 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23335f60-d3db-4308-b1fe-a4603a8d65e7","Type":"ContainerDied","Data":"6cd4b7442c5d8c26b0f74bce2e01f566fd68b0161b693e247c6a5f2f6624bc3d"} Dec 04 12:37:56 crc kubenswrapper[4760]: I1204 12:37:56.435149 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7fcdc7577b-rkxk5"] Dec 04 12:37:56 crc kubenswrapper[4760]: I1204 12:37:56.435559 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7fcdc7577b-rkxk5" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api-log" containerID="cri-o://04efaff8b456f78b9ea2a869f0869ba1537e11ced6a1f5d00ed73338178777e1" gracePeriod=30 Dec 04 12:37:56 crc kubenswrapper[4760]: I1204 12:37:56.435684 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7fcdc7577b-rkxk5" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api" containerID="cri-o://5a21f8fa80df7f7cfb02463472f713e42b712688c55aa4f218f71f1491bdcd83" gracePeriod=30 Dec 04 12:37:56 crc kubenswrapper[4760]: I1204 12:37:56.440803 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"da38006e-8a3a-4218-bacf-0e24e2cf9149","Type":"ContainerStarted","Data":"deec6eec75a0185d1529eea3522e6802c1ebc672473897bf28491540cd5bcdd9"} Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.497759 4760 generic.go:334] "Generic (PLEG): container finished" podID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerID="04efaff8b456f78b9ea2a869f0869ba1537e11ced6a1f5d00ed73338178777e1" exitCode=143 Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.497894 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcdc7577b-rkxk5" event={"ID":"5fef0dd3-e954-4c82-8d4e-82ae547a4b03","Type":"ContainerDied","Data":"04efaff8b456f78b9ea2a869f0869ba1537e11ced6a1f5d00ed73338178777e1"} Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.510868 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.542623 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"556dacca-8542-42df-97e5-a09db3716d3f","Type":"ContainerStarted","Data":"e479c3460aedc4df7d18e03113a64e1bd0b98ef7e9132df4deba1c0408af01ec"} Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.611789 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56696ff475-64jjh" event={"ID":"120ed25b-47b0-4306-974b-a66255cac4ce","Type":"ContainerStarted","Data":"a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d"} Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.612474 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.657857 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f9db4bf7b-jdfxz" event={"ID":"94d936ad-6af2-4f94-8d9e-0111032b5cad","Type":"ContainerDied","Data":"9280322612f3dd921e141c41af730f21eafa9c0d36ddc99ec66c18f8e6a900cb"} Dec 04 12:37:57 crc 
kubenswrapper[4760]: I1204 12:37:57.658296 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9280322612f3dd921e141c41af730f21eafa9c0d36ddc99ec66c18f8e6a900cb" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.658565 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.677359 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"82e6f33c-9843-4810-9cec-5b7b7525d759","Type":"ContainerStarted","Data":"99b700be413467220a20a5cf5b7961276a386ac0acd461ed2e0005f6b191055a"} Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.685752 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.703572 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23335f60-d3db-4308-b1fe-a4603a8d65e7","Type":"ContainerDied","Data":"cd3adc897c69dfb42bb6fca985c1000bdf29435695513a637f21554b947b6409"} Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.703660 4760 scope.go:117] "RemoveContainer" containerID="6cd4b7442c5d8c26b0f74bce2e01f566fd68b0161b693e247c6a5f2f6624bc3d" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.720518 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56696ff475-64jjh" podStartSLOduration=8.720490391 podStartE2EDuration="8.720490391s" podCreationTimestamp="2025-12-04 12:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:37:57.643750305 +0000 UTC m=+1480.685196882" watchObservedRunningTime="2025-12-04 12:37:57.720490391 +0000 UTC m=+1480.761936958" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.802479 4760 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=7.802448262 podStartE2EDuration="7.802448262s" podCreationTimestamp="2025-12-04 12:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:37:57.789533962 +0000 UTC m=+1480.830980529" watchObservedRunningTime="2025-12-04 12:37:57.802448262 +0000 UTC m=+1480.843894829" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.806517 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-run-httpd\") pod \"23335f60-d3db-4308-b1fe-a4603a8d65e7\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.806610 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-combined-ca-bundle\") pod \"23335f60-d3db-4308-b1fe-a4603a8d65e7\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.806656 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-sg-core-conf-yaml\") pod \"23335f60-d3db-4308-b1fe-a4603a8d65e7\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.806780 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-combined-ca-bundle\") pod \"94d936ad-6af2-4f94-8d9e-0111032b5cad\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.806808 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-scripts\") pod \"23335f60-d3db-4308-b1fe-a4603a8d65e7\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.806859 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-config-data\") pod \"23335f60-d3db-4308-b1fe-a4603a8d65e7\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.806891 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4gtd\" (UniqueName: \"kubernetes.io/projected/23335f60-d3db-4308-b1fe-a4603a8d65e7-kube-api-access-v4gtd\") pod \"23335f60-d3db-4308-b1fe-a4603a8d65e7\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.806934 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-ovndb-tls-certs\") pod \"94d936ad-6af2-4f94-8d9e-0111032b5cad\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.806963 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-log-httpd\") pod \"23335f60-d3db-4308-b1fe-a4603a8d65e7\" (UID: \"23335f60-d3db-4308-b1fe-a4603a8d65e7\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.806987 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-config\") pod \"94d936ad-6af2-4f94-8d9e-0111032b5cad\" (UID: 
\"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.807012 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-httpd-config\") pod \"94d936ad-6af2-4f94-8d9e-0111032b5cad\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.807070 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxf2g\" (UniqueName: \"kubernetes.io/projected/94d936ad-6af2-4f94-8d9e-0111032b5cad-kube-api-access-bxf2g\") pod \"94d936ad-6af2-4f94-8d9e-0111032b5cad\" (UID: \"94d936ad-6af2-4f94-8d9e-0111032b5cad\") " Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.813420 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "23335f60-d3db-4308-b1fe-a4603a8d65e7" (UID: "23335f60-d3db-4308-b1fe-a4603a8d65e7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.815367 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "23335f60-d3db-4308-b1fe-a4603a8d65e7" (UID: "23335f60-d3db-4308-b1fe-a4603a8d65e7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.817660 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.817715 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23335f60-d3db-4308-b1fe-a4603a8d65e7-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.831389 4760 scope.go:117] "RemoveContainer" containerID="5e2f33fe8fe3223fef6bc089e77f6416ffe37827017225a0e0eb344ea979a811" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.892145 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-scripts" (OuterVolumeSpecName: "scripts") pod "23335f60-d3db-4308-b1fe-a4603a8d65e7" (UID: "23335f60-d3db-4308-b1fe-a4603a8d65e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.900316 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23335f60-d3db-4308-b1fe-a4603a8d65e7-kube-api-access-v4gtd" (OuterVolumeSpecName: "kube-api-access-v4gtd") pod "23335f60-d3db-4308-b1fe-a4603a8d65e7" (UID: "23335f60-d3db-4308-b1fe-a4603a8d65e7"). InnerVolumeSpecName "kube-api-access-v4gtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.905416 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "94d936ad-6af2-4f94-8d9e-0111032b5cad" (UID: "94d936ad-6af2-4f94-8d9e-0111032b5cad"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.919982 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.920436 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4gtd\" (UniqueName: \"kubernetes.io/projected/23335f60-d3db-4308-b1fe-a4603a8d65e7-kube-api-access-v4gtd\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.920529 4760 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.933814 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "23335f60-d3db-4308-b1fe-a4603a8d65e7" (UID: "23335f60-d3db-4308-b1fe-a4603a8d65e7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:57 crc kubenswrapper[4760]: I1204 12:37:57.956988 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d936ad-6af2-4f94-8d9e-0111032b5cad-kube-api-access-bxf2g" (OuterVolumeSpecName: "kube-api-access-bxf2g") pod "94d936ad-6af2-4f94-8d9e-0111032b5cad" (UID: "94d936ad-6af2-4f94-8d9e-0111032b5cad"). InnerVolumeSpecName "kube-api-access-bxf2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.026051 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxf2g\" (UniqueName: \"kubernetes.io/projected/94d936ad-6af2-4f94-8d9e-0111032b5cad-kube-api-access-bxf2g\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.026462 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.224445 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23335f60-d3db-4308-b1fe-a4603a8d65e7" (UID: "23335f60-d3db-4308-b1fe-a4603a8d65e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.225548 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-config" (OuterVolumeSpecName: "config") pod "94d936ad-6af2-4f94-8d9e-0111032b5cad" (UID: "94d936ad-6af2-4f94-8d9e-0111032b5cad"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.247096 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.247136 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.251858 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94d936ad-6af2-4f94-8d9e-0111032b5cad" (UID: "94d936ad-6af2-4f94-8d9e-0111032b5cad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.377199 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.857347 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.885354 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"21d77c4c-3493-44c4-b194-6d9dd912d5a1","Type":"ContainerStarted","Data":"f4fc7050283613ea9d969487021570dd2629b5e7ee5e9af940dde9caa67f596f"} Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.909429 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5bf94526-eda5-4784-a223-e0ff51ec09e8","Type":"ContainerStarted","Data":"c4c753651194d766c893c02b55cd4e6bf06e0741f5f14567a23bf98b52ee6ff4"} Dec 04 12:37:58 crc kubenswrapper[4760]: I1204 12:37:58.909959 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f9db4bf7b-jdfxz" Dec 04 12:37:59 crc kubenswrapper[4760]: I1204 12:37:59.067378 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.166:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:37:59 crc kubenswrapper[4760]: I1204 12:37:59.103115 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-config-data" (OuterVolumeSpecName: "config-data") pod "23335f60-d3db-4308-b1fe-a4603a8d65e7" (UID: "23335f60-d3db-4308-b1fe-a4603a8d65e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:59 crc kubenswrapper[4760]: I1204 12:37:59.110749 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "94d936ad-6af2-4f94-8d9e-0111032b5cad" (UID: "94d936ad-6af2-4f94-8d9e-0111032b5cad"). 
InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:37:59 crc kubenswrapper[4760]: I1204 12:37:59.163033 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23335f60-d3db-4308-b1fe-a4603a8d65e7-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:59 crc kubenswrapper[4760]: I1204 12:37:59.164191 4760 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d936ad-6af2-4f94-8d9e-0111032b5cad-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:37:59 crc kubenswrapper[4760]: I1204 12:37:59.975517 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"da38006e-8a3a-4218-bacf-0e24e2cf9149","Type":"ContainerStarted","Data":"3e7d6674bae82a904fc4f79a0811a8fde254cbb480b05e33d4ba767848c2f1c9"} Dec 04 12:37:59 crc kubenswrapper[4760]: I1204 12:37:59.976005 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="da38006e-8a3a-4218-bacf-0e24e2cf9149" containerName="manila-api-log" containerID="cri-o://deec6eec75a0185d1529eea3522e6802c1ebc672473897bf28491540cd5bcdd9" gracePeriod=30 Dec 04 12:37:59 crc kubenswrapper[4760]: I1204 12:37:59.976481 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 04 12:37:59 crc kubenswrapper[4760]: I1204 12:37:59.976938 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="da38006e-8a3a-4218-bacf-0e24e2cf9149" containerName="manila-api" containerID="cri-o://3e7d6674bae82a904fc4f79a0811a8fde254cbb480b05e33d4ba767848c2f1c9" gracePeriod=30 Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.015807 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=11.015787232 podStartE2EDuration="11.015787232s" 
podCreationTimestamp="2025-12-04 12:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:00.014893793 +0000 UTC m=+1483.056340370" watchObservedRunningTime="2025-12-04 12:38:00.015787232 +0000 UTC m=+1483.057233789" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.021424 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"21d77c4c-3493-44c4-b194-6d9dd912d5a1","Type":"ContainerStarted","Data":"f84a785b3c9718e53980d1699c63af2cdbbf2325cebf1fe43328e8fc5eeeda53"} Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.276341 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.309907 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.386271 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:00 crc kubenswrapper[4760]: E1204 12:38:00.386816 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-api" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.386840 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-api" Dec 04 12:38:00 crc kubenswrapper[4760]: E1204 12:38:00.386868 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23335f60-d3db-4308-b1fe-a4603a8d65e7" containerName="proxy-httpd" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.386878 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="23335f60-d3db-4308-b1fe-a4603a8d65e7" containerName="proxy-httpd" Dec 04 12:38:00 crc kubenswrapper[4760]: E1204 12:38:00.386916 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="23335f60-d3db-4308-b1fe-a4603a8d65e7" containerName="ceilometer-notification-agent" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.386922 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="23335f60-d3db-4308-b1fe-a4603a8d65e7" containerName="ceilometer-notification-agent" Dec 04 12:38:00 crc kubenswrapper[4760]: E1204 12:38:00.386931 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-httpd" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.386939 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-httpd" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.387159 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-api" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.387172 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" containerName="neutron-httpd" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.387182 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="23335f60-d3db-4308-b1fe-a4603a8d65e7" containerName="ceilometer-notification-agent" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.387199 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="23335f60-d3db-4308-b1fe-a4603a8d65e7" containerName="proxy-httpd" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.433351 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.438995 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.444322 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.466011 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.522325 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f9db4bf7b-jdfxz"] Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.545016 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f9db4bf7b-jdfxz"] Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.570472 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fcdc7577b-rkxk5" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:42978->10.217.0.161:9311: read: connection reset by peer" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.570495 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fcdc7577b-rkxk5" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:42986->10.217.0.161:9311: read: connection reset by peer" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.589405 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " 
pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.589680 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hblsz\" (UniqueName: \"kubernetes.io/projected/2d73360c-cea6-4c66-88fc-554bda882906-kube-api-access-hblsz\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.589749 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-config-data\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.589850 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-scripts\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.589901 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-log-httpd\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.590299 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.591549 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-run-httpd\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.720587 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.720709 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hblsz\" (UniqueName: \"kubernetes.io/projected/2d73360c-cea6-4c66-88fc-554bda882906-kube-api-access-hblsz\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.720747 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-config-data\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.720796 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-scripts\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.720823 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-log-httpd\") pod \"ceilometer-0\" (UID: 
\"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.720901 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.720920 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-run-httpd\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.731447 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-run-httpd\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.731558 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-log-httpd\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.776720 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-config-data\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.787747 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.793771 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.794439 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-scripts\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.794535 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hblsz\" (UniqueName: \"kubernetes.io/projected/2d73360c-cea6-4c66-88fc-554bda882906-kube-api-access-hblsz\") pod \"ceilometer-0\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: I1204 12:38:00.837071 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:38:00 crc kubenswrapper[4760]: E1204 12:38:00.912476 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda38006e_8a3a_4218_bacf_0e24e2cf9149.slice/crio-conmon-3e7d6674bae82a904fc4f79a0811a8fde254cbb480b05e33d4ba767848c2f1c9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda38006e_8a3a_4218_bacf_0e24e2cf9149.slice/crio-deec6eec75a0185d1529eea3522e6802c1ebc672473897bf28491540cd5bcdd9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda38006e_8a3a_4218_bacf_0e24e2cf9149.slice/crio-conmon-deec6eec75a0185d1529eea3522e6802c1ebc672473897bf28491540cd5bcdd9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fef0dd3_e954_4c82_8d4e_82ae547a4b03.slice/crio-5a21f8fa80df7f7cfb02463472f713e42b712688c55aa4f218f71f1491bdcd83.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fef0dd3_e954_4c82_8d4e_82ae547a4b03.slice/crio-conmon-5a21f8fa80df7f7cfb02463472f713e42b712688c55aa4f218f71f1491bdcd83.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda38006e_8a3a_4218_bacf_0e24e2cf9149.slice/crio-3e7d6674bae82a904fc4f79a0811a8fde254cbb480b05e33d4ba767848c2f1c9.scope\": RecentStats: unable to find data in memory cache]" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.068311 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.083006 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"5bf94526-eda5-4784-a223-e0ff51ec09e8","Type":"ContainerStarted","Data":"11f65c7349d2da8723b03b1ac9fff177ce31717bfe3db11bb42d02b8dfc0df26"} Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.104389 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-volume-volume1-0" podUID="82e6f33c-9843-4810-9cec-5b7b7525d759" containerName="cinder-volume" probeResult="failure" output="Get \"http://10.217.0.174:8080/\": dial tcp 10.217.0.174:8080: connect: connection refused" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.203683 4760 generic.go:334] "Generic (PLEG): container finished" podID="da38006e-8a3a-4218-bacf-0e24e2cf9149" containerID="3e7d6674bae82a904fc4f79a0811a8fde254cbb480b05e33d4ba767848c2f1c9" exitCode=143 Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.203720 4760 generic.go:334] "Generic (PLEG): container finished" podID="da38006e-8a3a-4218-bacf-0e24e2cf9149" containerID="deec6eec75a0185d1529eea3522e6802c1ebc672473897bf28491540cd5bcdd9" exitCode=143 Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.203774 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"da38006e-8a3a-4218-bacf-0e24e2cf9149","Type":"ContainerDied","Data":"3e7d6674bae82a904fc4f79a0811a8fde254cbb480b05e33d4ba767848c2f1c9"} Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.203809 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"da38006e-8a3a-4218-bacf-0e24e2cf9149","Type":"ContainerDied","Data":"deec6eec75a0185d1529eea3522e6802c1ebc672473897bf28491540cd5bcdd9"} Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.251100 4760 generic.go:334] "Generic (PLEG): container finished" podID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerID="5a21f8fa80df7f7cfb02463472f713e42b712688c55aa4f218f71f1491bdcd83" exitCode=0 Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.251279 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcdc7577b-rkxk5" event={"ID":"5fef0dd3-e954-4c82-8d4e-82ae547a4b03","Type":"ContainerDied","Data":"5a21f8fa80df7f7cfb02463472f713e42b712688c55aa4f218f71f1491bdcd83"} Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.268014 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"556dacca-8542-42df-97e5-a09db3716d3f","Type":"ContainerStarted","Data":"e1ff30901d6ee464d733f0400cd869901d93749267a544fb830d64bf9c0bd933"} Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.314325 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=10.314180256 podStartE2EDuration="10.314180256s" podCreationTimestamp="2025-12-04 12:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:01.137322852 +0000 UTC m=+1484.178769419" watchObservedRunningTime="2025-12-04 12:38:01.314180256 +0000 UTC m=+1484.355626823" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.316994 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"21d77c4c-3493-44c4-b194-6d9dd912d5a1","Type":"ContainerStarted","Data":"4615403f436fcfd886ad19742e6858b6ecb8d8e0af6003c91d1cdb650ef2f531"} Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.396119 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=7.396092286 podStartE2EDuration="7.396092286s" podCreationTimestamp="2025-12-04 12:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:01.395381754 +0000 UTC m=+1484.436828321" watchObservedRunningTime="2025-12-04 12:38:01.396092286 +0000 UTC m=+1484.437538843" Dec 04 12:38:01 crc kubenswrapper[4760]: 
I1204 12:38:01.428903 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=10.81980042 podStartE2EDuration="12.428872467s" podCreationTimestamp="2025-12-04 12:37:49 +0000 UTC" firstStartedPulling="2025-12-04 12:37:52.114405236 +0000 UTC m=+1475.155851803" lastFinishedPulling="2025-12-04 12:37:53.723477283 +0000 UTC m=+1476.764923850" observedRunningTime="2025-12-04 12:38:01.30264619 +0000 UTC m=+1484.344092757" watchObservedRunningTime="2025-12-04 12:38:01.428872467 +0000 UTC m=+1484.470319034" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.589677 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.647780 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.669370 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data-custom\") pod \"da38006e-8a3a-4218-bacf-0e24e2cf9149\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.669520 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txhd8\" (UniqueName: \"kubernetes.io/projected/da38006e-8a3a-4218-bacf-0e24e2cf9149-kube-api-access-txhd8\") pod \"da38006e-8a3a-4218-bacf-0e24e2cf9149\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.669551 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da38006e-8a3a-4218-bacf-0e24e2cf9149-etc-machine-id\") pod \"da38006e-8a3a-4218-bacf-0e24e2cf9149\" (UID: 
\"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.669662 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data\") pod \"da38006e-8a3a-4218-bacf-0e24e2cf9149\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.669853 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-scripts\") pod \"da38006e-8a3a-4218-bacf-0e24e2cf9149\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.669917 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da38006e-8a3a-4218-bacf-0e24e2cf9149-logs\") pod \"da38006e-8a3a-4218-bacf-0e24e2cf9149\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.669952 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-combined-ca-bundle\") pod \"da38006e-8a3a-4218-bacf-0e24e2cf9149\" (UID: \"da38006e-8a3a-4218-bacf-0e24e2cf9149\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.676118 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da38006e-8a3a-4218-bacf-0e24e2cf9149-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "da38006e-8a3a-4218-bacf-0e24e2cf9149" (UID: "da38006e-8a3a-4218-bacf-0e24e2cf9149"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.699735 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da38006e-8a3a-4218-bacf-0e24e2cf9149-logs" (OuterVolumeSpecName: "logs") pod "da38006e-8a3a-4218-bacf-0e24e2cf9149" (UID: "da38006e-8a3a-4218-bacf-0e24e2cf9149"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.708293 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da38006e-8a3a-4218-bacf-0e24e2cf9149" (UID: "da38006e-8a3a-4218-bacf-0e24e2cf9149"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.709778 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-scripts" (OuterVolumeSpecName: "scripts") pod "da38006e-8a3a-4218-bacf-0e24e2cf9149" (UID: "da38006e-8a3a-4218-bacf-0e24e2cf9149"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.715024 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da38006e-8a3a-4218-bacf-0e24e2cf9149-kube-api-access-txhd8" (OuterVolumeSpecName: "kube-api-access-txhd8") pod "da38006e-8a3a-4218-bacf-0e24e2cf9149" (UID: "da38006e-8a3a-4218-bacf-0e24e2cf9149"). InnerVolumeSpecName "kube-api-access-txhd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: W1204 12:38:01.767516 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d73360c_cea6_4c66_88fc_554bda882906.slice/crio-0304a3431d286ab35856afdf02f8fe8b3b9133c197290df9da7ab27c50d6c9d9 WatchSource:0}: Error finding container 0304a3431d286ab35856afdf02f8fe8b3b9133c197290df9da7ab27c50d6c9d9: Status 404 returned error can't find the container with id 0304a3431d286ab35856afdf02f8fe8b3b9133c197290df9da7ab27c50d6c9d9 Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.778146 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data\") pod \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.778255 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-logs\") pod \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.778301 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-combined-ca-bundle\") pod \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.778499 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hshd2\" (UniqueName: \"kubernetes.io/projected/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-kube-api-access-hshd2\") pod \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\" (UID: 
\"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.778583 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data-custom\") pod \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\" (UID: \"5fef0dd3-e954-4c82-8d4e-82ae547a4b03\") " Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.779135 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.779157 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txhd8\" (UniqueName: \"kubernetes.io/projected/da38006e-8a3a-4218-bacf-0e24e2cf9149-kube-api-access-txhd8\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.779171 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da38006e-8a3a-4218-bacf-0e24e2cf9149-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.779180 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.779189 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da38006e-8a3a-4218-bacf-0e24e2cf9149-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.779121 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-logs" (OuterVolumeSpecName: "logs") pod 
"5fef0dd3-e954-4c82-8d4e-82ae547a4b03" (UID: "5fef0dd3-e954-4c82-8d4e-82ae547a4b03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.780197 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da38006e-8a3a-4218-bacf-0e24e2cf9149" (UID: "da38006e-8a3a-4218-bacf-0e24e2cf9149"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.794723 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.817856 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-kube-api-access-hshd2" (OuterVolumeSpecName: "kube-api-access-hshd2") pod "5fef0dd3-e954-4c82-8d4e-82ae547a4b03" (UID: "5fef0dd3-e954-4c82-8d4e-82ae547a4b03"). InnerVolumeSpecName "kube-api-access-hshd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.829741 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5fef0dd3-e954-4c82-8d4e-82ae547a4b03" (UID: "5fef0dd3-e954-4c82-8d4e-82ae547a4b03"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.870454 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data" (OuterVolumeSpecName: "config-data") pod "da38006e-8a3a-4218-bacf-0e24e2cf9149" (UID: "da38006e-8a3a-4218-bacf-0e24e2cf9149"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.895288 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.895333 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.895344 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da38006e-8a3a-4218-bacf-0e24e2cf9149-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.895355 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hshd2\" (UniqueName: \"kubernetes.io/projected/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-kube-api-access-hshd2\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.895365 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.902607 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23335f60-d3db-4308-b1fe-a4603a8d65e7" 
path="/var/lib/kubelet/pods/23335f60-d3db-4308-b1fe-a4603a8d65e7/volumes" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.903417 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94d936ad-6af2-4f94-8d9e-0111032b5cad" path="/var/lib/kubelet/pods/94d936ad-6af2-4f94-8d9e-0111032b5cad/volumes" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.939450 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data" (OuterVolumeSpecName: "config-data") pod "5fef0dd3-e954-4c82-8d4e-82ae547a4b03" (UID: "5fef0dd3-e954-4c82-8d4e-82ae547a4b03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.940007 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fef0dd3-e954-4c82-8d4e-82ae547a4b03" (UID: "5fef0dd3-e954-4c82-8d4e-82ae547a4b03"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.998786 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:01 crc kubenswrapper[4760]: I1204 12:38:01.998827 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fef0dd3-e954-4c82-8d4e-82ae547a4b03-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.357390 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d73360c-cea6-4c66-88fc-554bda882906","Type":"ContainerStarted","Data":"0304a3431d286ab35856afdf02f8fe8b3b9133c197290df9da7ab27c50d6c9d9"} Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.368095 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"da38006e-8a3a-4218-bacf-0e24e2cf9149","Type":"ContainerDied","Data":"ca092249b89a27342b71e78cd224f85a8c875cbe8efa285849fc230754edccc0"} Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.368168 4760 scope.go:117] "RemoveContainer" containerID="3e7d6674bae82a904fc4f79a0811a8fde254cbb480b05e33d4ba767848c2f1c9" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.368397 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.368933 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.377559 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="5bf94526-eda5-4784-a223-e0ff51ec09e8" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.175:8080/\": dial tcp 10.217.0.175:8080: connect: connection refused" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.403328 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7fcdc7577b-rkxk5" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.408442 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fcdc7577b-rkxk5" event={"ID":"5fef0dd3-e954-4c82-8d4e-82ae547a4b03","Type":"ContainerDied","Data":"7fe1ef09ee1da765e3a604e9ec0edd8d4ff53f1d7cf3b3e4fb79c2814c451c56"} Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.416187 4760 scope.go:117] "RemoveContainer" containerID="deec6eec75a0185d1529eea3522e6802c1ebc672473897bf28491540cd5bcdd9" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.419124 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.447374 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.461662 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 04 12:38:02 crc kubenswrapper[4760]: E1204 12:38:02.463161 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da38006e-8a3a-4218-bacf-0e24e2cf9149" containerName="manila-api" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.463369 4760 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="da38006e-8a3a-4218-bacf-0e24e2cf9149" containerName="manila-api" Dec 04 12:38:02 crc kubenswrapper[4760]: E1204 12:38:02.463477 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da38006e-8a3a-4218-bacf-0e24e2cf9149" containerName="manila-api-log" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.463558 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="da38006e-8a3a-4218-bacf-0e24e2cf9149" containerName="manila-api-log" Dec 04 12:38:02 crc kubenswrapper[4760]: E1204 12:38:02.463696 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.463782 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api" Dec 04 12:38:02 crc kubenswrapper[4760]: E1204 12:38:02.463924 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api-log" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.464058 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api-log" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.464653 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api-log" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.464836 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="da38006e-8a3a-4218-bacf-0e24e2cf9149" containerName="manila-api" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.464918 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" containerName="barbican-api" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.465005 4760 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="da38006e-8a3a-4218-bacf-0e24e2cf9149" containerName="manila-api-log" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.467437 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.473272 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.474289 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.475426 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.496344 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.512699 4760 scope.go:117] "RemoveContainer" containerID="5a21f8fa80df7f7cfb02463472f713e42b712688c55aa4f218f71f1491bdcd83" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.530297 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7fcdc7577b-rkxk5"] Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.548331 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7fcdc7577b-rkxk5"] Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.560486 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-internal-tls-certs\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.560547 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.560636 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-config-data-custom\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.560726 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de257688-7d38-4795-b8a6-36b58bdbc2b8-etc-machine-id\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.560808 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-public-tls-certs\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.561028 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-config-data\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.561105 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-scripts\") pod \"manila-api-0\" (UID: 
\"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.561247 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t5kk\" (UniqueName: \"kubernetes.io/projected/de257688-7d38-4795-b8a6-36b58bdbc2b8-kube-api-access-7t5kk\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.561312 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de257688-7d38-4795-b8a6-36b58bdbc2b8-logs\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.623368 4760 scope.go:117] "RemoveContainer" containerID="04efaff8b456f78b9ea2a869f0869ba1537e11ced6a1f5d00ed73338178777e1" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.671108 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-scripts\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.671196 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t5kk\" (UniqueName: \"kubernetes.io/projected/de257688-7d38-4795-b8a6-36b58bdbc2b8-kube-api-access-7t5kk\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.671250 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de257688-7d38-4795-b8a6-36b58bdbc2b8-logs\") pod \"manila-api-0\" (UID: 
\"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.671304 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-internal-tls-certs\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.671327 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.671370 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-config-data-custom\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.671407 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de257688-7d38-4795-b8a6-36b58bdbc2b8-etc-machine-id\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.671440 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-public-tls-certs\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.671521 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-config-data\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.672419 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de257688-7d38-4795-b8a6-36b58bdbc2b8-etc-machine-id\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.673366 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de257688-7d38-4795-b8a6-36b58bdbc2b8-logs\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.680191 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-config-data-custom\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.683787 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-public-tls-certs\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.684035 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: 
I1204 12:38:02.685136 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-config-data\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.686673 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-scripts\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.708008 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de257688-7d38-4795-b8a6-36b58bdbc2b8-internal-tls-certs\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.715995 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t5kk\" (UniqueName: \"kubernetes.io/projected/de257688-7d38-4795-b8a6-36b58bdbc2b8-kube-api-access-7t5kk\") pod \"manila-api-0\" (UID: \"de257688-7d38-4795-b8a6-36b58bdbc2b8\") " pod="openstack/manila-api-0" Dec 04 12:38:02 crc kubenswrapper[4760]: I1204 12:38:02.800772 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 04 12:38:03 crc kubenswrapper[4760]: I1204 12:38:03.481799 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d73360c-cea6-4c66-88fc-554bda882906","Type":"ContainerStarted","Data":"b1dfedeff10199442aca4e261d67cc87180f8763c01db6af140c574649b7f42e"} Dec 04 12:38:03 crc kubenswrapper[4760]: I1204 12:38:03.884573 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fef0dd3-e954-4c82-8d4e-82ae547a4b03" path="/var/lib/kubelet/pods/5fef0dd3-e954-4c82-8d4e-82ae547a4b03/volumes" Dec 04 12:38:03 crc kubenswrapper[4760]: I1204 12:38:03.886810 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da38006e-8a3a-4218-bacf-0e24e2cf9149" path="/var/lib/kubelet/pods/da38006e-8a3a-4218-bacf-0e24e2cf9149/volumes" Dec 04 12:38:03 crc kubenswrapper[4760]: I1204 12:38:03.941644 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 12:38:04 crc kubenswrapper[4760]: I1204 12:38:04.130167 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.166:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:38:04 crc kubenswrapper[4760]: I1204 12:38:04.163563 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 12:38:04 crc kubenswrapper[4760]: I1204 12:38:04.496370 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"de257688-7d38-4795-b8a6-36b58bdbc2b8","Type":"ContainerStarted","Data":"5b82a936ddb03813b173849acaefd20d1cac2584b1c039e150b2bccdf9ce25f1"} Dec 04 12:38:05 crc kubenswrapper[4760]: I1204 12:38:05.435603 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:38:05 crc kubenswrapper[4760]: I1204 12:38:05.509842 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 04 12:38:05 crc kubenswrapper[4760]: I1204 12:38:05.529278 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-backup-0" podUID="21d77c4c-3493-44c4-b194-6d9dd912d5a1" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.217.0.176:8080/\": dial tcp 10.217.0.176:8080: connect: connection refused" Dec 04 12:38:05 crc kubenswrapper[4760]: I1204 12:38:05.565802 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cc4k9"] Dec 04 12:38:05 crc kubenswrapper[4760]: I1204 12:38:05.566244 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" podUID="fc1d29a4-5ae9-4726-b17c-5a1494bb9240" containerName="dnsmasq-dns" containerID="cri-o://a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521" gracePeriod=10 Dec 04 12:38:05 crc kubenswrapper[4760]: I1204 12:38:05.618934 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"de257688-7d38-4795-b8a6-36b58bdbc2b8","Type":"ContainerStarted","Data":"c1238823693aae5aed434d24874c3196adf549928a387bb5e2916b7f2ca1df46"} Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.396704 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.409747 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5btfv\" (UniqueName: \"kubernetes.io/projected/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-kube-api-access-5btfv\") pod \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.409832 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-config\") pod \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.410042 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-sb\") pod \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.410142 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-svc\") pod \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.410281 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-nb\") pod \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.410426 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-swift-storage-0\") pod \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\" (UID: \"fc1d29a4-5ae9-4726-b17c-5a1494bb9240\") " Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.438554 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-kube-api-access-5btfv" (OuterVolumeSpecName: "kube-api-access-5btfv") pod "fc1d29a4-5ae9-4726-b17c-5a1494bb9240" (UID: "fc1d29a4-5ae9-4726-b17c-5a1494bb9240"). InnerVolumeSpecName "kube-api-access-5btfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.534051 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5btfv\" (UniqueName: \"kubernetes.io/projected/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-kube-api-access-5btfv\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.667975 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc1d29a4-5ae9-4726-b17c-5a1494bb9240" (UID: "fc1d29a4-5ae9-4726-b17c-5a1494bb9240"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.680360 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc1d29a4-5ae9-4726-b17c-5a1494bb9240" (UID: "fc1d29a4-5ae9-4726-b17c-5a1494bb9240"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.689191 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc1d29a4-5ae9-4726-b17c-5a1494bb9240" (UID: "fc1d29a4-5ae9-4726-b17c-5a1494bb9240"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.697070 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc1d29a4-5ae9-4726-b17c-5a1494bb9240" (UID: "fc1d29a4-5ae9-4726-b17c-5a1494bb9240"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.704882 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d73360c-cea6-4c66-88fc-554bda882906","Type":"ContainerStarted","Data":"dec731349ad60d84bf4026ef2e71dd888e6f369e9b4152fcaf19acb81ed30e23"} Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.725493 4760 generic.go:334] "Generic (PLEG): container finished" podID="fc1d29a4-5ae9-4726-b17c-5a1494bb9240" containerID="a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521" exitCode=0 Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.725604 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" event={"ID":"fc1d29a4-5ae9-4726-b17c-5a1494bb9240","Type":"ContainerDied","Data":"a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521"} Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.725643 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" 
event={"ID":"fc1d29a4-5ae9-4726-b17c-5a1494bb9240","Type":"ContainerDied","Data":"73f028c1745ef8f9a481bbaff344a7eefd4b5e63fc00c588014c50e3f37aaccf"} Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.725663 4760 scope.go:117] "RemoveContainer" containerID="a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.725873 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-cc4k9" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.763354 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.763400 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.763414 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.763425 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.764342 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"de257688-7d38-4795-b8a6-36b58bdbc2b8","Type":"ContainerStarted","Data":"02f43c53b977f874308fbf16622a30737591199af3d473cb651f08d8fd202c03"} Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.771967 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/manila-api-0" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.823318 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.823273952 podStartE2EDuration="4.823273952s" podCreationTimestamp="2025-12-04 12:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:06.810043863 +0000 UTC m=+1489.851490430" watchObservedRunningTime="2025-12-04 12:38:06.823273952 +0000 UTC m=+1489.864720529" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.835914 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-config" (OuterVolumeSpecName: "config") pod "fc1d29a4-5ae9-4726-b17c-5a1494bb9240" (UID: "fc1d29a4-5ae9-4726-b17c-5a1494bb9240"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.871167 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1d29a4-5ae9-4726-b17c-5a1494bb9240-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.893376 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.893874 4760 scope.go:117] "RemoveContainer" containerID="c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.967514 4760 scope.go:117] "RemoveContainer" containerID="a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521" Dec 04 12:38:06 crc kubenswrapper[4760]: E1204 12:38:06.970739 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521\": container with ID starting with a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521 not found: ID does not exist" containerID="a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.970841 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521"} err="failed to get container status \"a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521\": rpc error: code = NotFound desc = could not find container \"a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521\": container with ID starting with a329e1f4129dcdbf77fb1bc2499e1ebf4f64b49cb2dc2babcf824cd431fd9521 not found: ID does not exist" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.970879 4760 scope.go:117] "RemoveContainer" containerID="c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2" Dec 04 12:38:06 crc kubenswrapper[4760]: E1204 12:38:06.971346 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2\": container with ID starting with c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2 not found: ID does not exist" containerID="c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2" Dec 04 12:38:06 crc kubenswrapper[4760]: I1204 12:38:06.971370 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2"} err="failed to get container status \"c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2\": rpc error: code = NotFound desc = could not find container \"c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2\": container 
with ID starting with c613ec915011b054584c0cb413d367975bc20c0352375bbd3ab1501cb14fcef2 not found: ID does not exist" Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.104398 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cc4k9"] Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.129079 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-cc4k9"] Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.787404 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d73360c-cea6-4c66-88fc-554bda882906","Type":"ContainerStarted","Data":"11fcf04cef9e88ef54b99cb0f6171bd76b856d2a352a3af6e19b74a93df21a50"} Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.902849 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc1d29a4-5ae9-4726-b17c-5a1494bb9240" path="/var/lib/kubelet/pods/fc1d29a4-5ae9-4726-b17c-5a1494bb9240/volumes" Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.905129 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-f6947989-zvg6b"] Dec 04 12:38:07 crc kubenswrapper[4760]: E1204 12:38:07.905599 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1d29a4-5ae9-4726-b17c-5a1494bb9240" containerName="init" Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.905618 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1d29a4-5ae9-4726-b17c-5a1494bb9240" containerName="init" Dec 04 12:38:07 crc kubenswrapper[4760]: E1204 12:38:07.905642 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1d29a4-5ae9-4726-b17c-5a1494bb9240" containerName="dnsmasq-dns" Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.905649 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1d29a4-5ae9-4726-b17c-5a1494bb9240" containerName="dnsmasq-dns" Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.905865 4760 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1d29a4-5ae9-4726-b17c-5a1494bb9240" containerName="dnsmasq-dns" Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.907609 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.912956 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.913318 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.915628 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 04 12:38:07 crc kubenswrapper[4760]: I1204 12:38:07.982310 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-f6947989-zvg6b"] Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.024691 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.027767 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b0dacfe-716a-44d3-a653-88fc5183ae97-etc-swift\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.027845 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-config-data\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 
12:38:08.027941 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvkjp\" (UniqueName: \"kubernetes.io/projected/8b0dacfe-716a-44d3-a653-88fc5183ae97-kube-api-access-hvkjp\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.027996 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b0dacfe-716a-44d3-a653-88fc5183ae97-run-httpd\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.028033 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-internal-tls-certs\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.028079 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-combined-ca-bundle\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.028179 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-public-tls-certs\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 
12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.028271 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b0dacfe-716a-44d3-a653-88fc5183ae97-log-httpd\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.130790 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvkjp\" (UniqueName: \"kubernetes.io/projected/8b0dacfe-716a-44d3-a653-88fc5183ae97-kube-api-access-hvkjp\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.130895 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b0dacfe-716a-44d3-a653-88fc5183ae97-run-httpd\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.130940 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-internal-tls-certs\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.130981 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-combined-ca-bundle\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 
12:38:08.131051 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-public-tls-certs\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.131116 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b0dacfe-716a-44d3-a653-88fc5183ae97-log-httpd\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.131257 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b0dacfe-716a-44d3-a653-88fc5183ae97-etc-swift\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.131310 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-config-data\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.132059 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b0dacfe-716a-44d3-a653-88fc5183ae97-log-httpd\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.136027 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8b0dacfe-716a-44d3-a653-88fc5183ae97-run-httpd\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.139852 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-public-tls-certs\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.141927 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-internal-tls-certs\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.151383 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-config-data\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.153286 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0dacfe-716a-44d3-a653-88fc5183ae97-combined-ca-bundle\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.155394 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b0dacfe-716a-44d3-a653-88fc5183ae97-etc-swift\") pod \"swift-proxy-f6947989-zvg6b\" (UID: 
\"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.157287 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvkjp\" (UniqueName: \"kubernetes.io/projected/8b0dacfe-716a-44d3-a653-88fc5183ae97-kube-api-access-hvkjp\") pod \"swift-proxy-f6947989-zvg6b\" (UID: \"8b0dacfe-716a-44d3-a653-88fc5183ae97\") " pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.242793 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.836878 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d73360c-cea6-4c66-88fc-554bda882906","Type":"ContainerStarted","Data":"73ed5907d94c6713726335ddd41a9effa5ccd168fbba02f8aa054a9effbd4b5a"} Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.840921 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.854886 4760 generic.go:334] "Generic (PLEG): container finished" podID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerID="6caefa18364108968ba835bd145212d045b8b3af109c2b69532fb85ba3e91082" exitCode=137 Dec 04 12:38:08 crc kubenswrapper[4760]: I1204 12:38:08.857874 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"407783b0-acfe-48e9-87a8-83ba4c28ab08","Type":"ContainerDied","Data":"6caefa18364108968ba835bd145212d045b8b3af109c2b69532fb85ba3e91082"} Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.133281 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.87936731 podStartE2EDuration="9.133248038s" podCreationTimestamp="2025-12-04 12:38:00 +0000 UTC" firstStartedPulling="2025-12-04 
12:38:01.786470538 +0000 UTC m=+1484.827917105" lastFinishedPulling="2025-12-04 12:38:08.040351266 +0000 UTC m=+1491.081797833" observedRunningTime="2025-12-04 12:38:08.879026839 +0000 UTC m=+1491.920473426" watchObservedRunningTime="2025-12-04 12:38:09.133248038 +0000 UTC m=+1492.174694605" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.138735 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-f6947989-zvg6b"] Dec 04 12:38:09 crc kubenswrapper[4760]: W1204 12:38:09.152188 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b0dacfe_716a_44d3_a653_88fc5183ae97.slice/crio-e8ce18be9398ca57fe1e1a87383b7fa596d2a905f4f58cd8c97705dfcc8b3093 WatchSource:0}: Error finding container e8ce18be9398ca57fe1e1a87383b7fa596d2a905f4f58cd8c97705dfcc8b3093: Status 404 returned error can't find the container with id e8ce18be9398ca57fe1e1a87383b7fa596d2a905f4f58cd8c97705dfcc8b3093 Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.369328 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.471054 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrhcp\" (UniqueName: \"kubernetes.io/projected/407783b0-acfe-48e9-87a8-83ba4c28ab08-kube-api-access-hrhcp\") pod \"407783b0-acfe-48e9-87a8-83ba4c28ab08\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.471197 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data\") pod \"407783b0-acfe-48e9-87a8-83ba4c28ab08\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.471290 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-scripts\") pod \"407783b0-acfe-48e9-87a8-83ba4c28ab08\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.471340 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/407783b0-acfe-48e9-87a8-83ba4c28ab08-etc-machine-id\") pod \"407783b0-acfe-48e9-87a8-83ba4c28ab08\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.471510 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/407783b0-acfe-48e9-87a8-83ba4c28ab08-logs\") pod \"407783b0-acfe-48e9-87a8-83ba4c28ab08\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.471556 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data-custom\") pod \"407783b0-acfe-48e9-87a8-83ba4c28ab08\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.471589 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-combined-ca-bundle\") pod \"407783b0-acfe-48e9-87a8-83ba4c28ab08\" (UID: \"407783b0-acfe-48e9-87a8-83ba4c28ab08\") " Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.473101 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/407783b0-acfe-48e9-87a8-83ba4c28ab08-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "407783b0-acfe-48e9-87a8-83ba4c28ab08" (UID: "407783b0-acfe-48e9-87a8-83ba4c28ab08"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.476438 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/407783b0-acfe-48e9-87a8-83ba4c28ab08-logs" (OuterVolumeSpecName: "logs") pod "407783b0-acfe-48e9-87a8-83ba4c28ab08" (UID: "407783b0-acfe-48e9-87a8-83ba4c28ab08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.497293 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "407783b0-acfe-48e9-87a8-83ba4c28ab08" (UID: "407783b0-acfe-48e9-87a8-83ba4c28ab08"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.500242 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/407783b0-acfe-48e9-87a8-83ba4c28ab08-kube-api-access-hrhcp" (OuterVolumeSpecName: "kube-api-access-hrhcp") pod "407783b0-acfe-48e9-87a8-83ba4c28ab08" (UID: "407783b0-acfe-48e9-87a8-83ba4c28ab08"). InnerVolumeSpecName "kube-api-access-hrhcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.513625 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-scripts" (OuterVolumeSpecName: "scripts") pod "407783b0-acfe-48e9-87a8-83ba4c28ab08" (UID: "407783b0-acfe-48e9-87a8-83ba4c28ab08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.575137 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.575175 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/407783b0-acfe-48e9-87a8-83ba4c28ab08-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.575186 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/407783b0-acfe-48e9-87a8-83ba4c28ab08-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.575195 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:09 crc 
kubenswrapper[4760]: I1204 12:38:09.575204 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrhcp\" (UniqueName: \"kubernetes.io/projected/407783b0-acfe-48e9-87a8-83ba4c28ab08-kube-api-access-hrhcp\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.627401 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "407783b0-acfe-48e9-87a8-83ba4c28ab08" (UID: "407783b0-acfe-48e9-87a8-83ba4c28ab08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.677583 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.736527 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data" (OuterVolumeSpecName: "config-data") pod "407783b0-acfe-48e9-87a8-83ba4c28ab08" (UID: "407783b0-acfe-48e9-87a8-83ba4c28ab08"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:09 crc kubenswrapper[4760]: I1204 12:38:09.779805 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407783b0-acfe-48e9-87a8-83ba4c28ab08-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.029425 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f6947989-zvg6b" event={"ID":"8b0dacfe-716a-44d3-a653-88fc5183ae97","Type":"ContainerStarted","Data":"b0539f3fbdc1725219ec0cbd000f4cac8d798ab1cf1d894ae274fb8026749b74"} Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.029488 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f6947989-zvg6b" event={"ID":"8b0dacfe-716a-44d3-a653-88fc5183ae97","Type":"ContainerStarted","Data":"e8ce18be9398ca57fe1e1a87383b7fa596d2a905f4f58cd8c97705dfcc8b3093"} Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.075950 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"407783b0-acfe-48e9-87a8-83ba4c28ab08","Type":"ContainerDied","Data":"57521c421cab46d902113e4db9b9d04db07bf9842f026af9a0ff6e300e75500c"} Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.076042 4760 scope.go:117] "RemoveContainer" containerID="6caefa18364108968ba835bd145212d045b8b3af109c2b69532fb85ba3e91082" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.076060 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.153885 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.180902 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.190518 4760 scope.go:117] "RemoveContainer" containerID="886f21ac4a1cc4d6ecfb9de0e981d6b14652a442d16cd965945fd4e25e0bc18a" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.229758 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 12:38:10 crc kubenswrapper[4760]: E1204 12:38:10.230763 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.230799 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api" Dec 04 12:38:10 crc kubenswrapper[4760]: E1204 12:38:10.230835 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api-log" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.230849 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api-log" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.231157 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.231226 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api-log" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.233015 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.249347 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.249445 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.249347 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.255564 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.322502 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kffm\" (UniqueName: \"kubernetes.io/projected/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-kube-api-access-6kffm\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.322584 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-logs\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.322629 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.322742 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.322787 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.322827 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.322891 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-scripts\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.322916 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-config-data\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.322985 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-internal-tls-certs\") pod 
\"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.378735 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.426168 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.426519 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.426543 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.426611 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.426670 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-scripts\") pod \"cinder-api-0\" (UID: 
\"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.426698 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-config-data\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.426759 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.426810 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kffm\" (UniqueName: \"kubernetes.io/projected/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-kube-api-access-6kffm\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.426855 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-logs\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.426888 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.428369 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-logs\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.434739 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.435454 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.436136 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.436381 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-config-data\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.440788 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-scripts\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.441624 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.455361 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kffm\" (UniqueName: \"kubernetes.io/projected/7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd-kube-api-access-6kffm\") pod \"cinder-api-0\" (UID: \"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd\") " pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.579916 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 12:38:10 crc kubenswrapper[4760]: I1204 12:38:10.965167 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 04 12:38:11 crc kubenswrapper[4760]: I1204 12:38:11.139064 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f6947989-zvg6b" event={"ID":"8b0dacfe-716a-44d3-a653-88fc5183ae97","Type":"ContainerStarted","Data":"b69bdb30a00ebc4710416808a173a15c95d840ca87ad91b870e39af3680d92ba"} Dec 04 12:38:11 crc kubenswrapper[4760]: I1204 12:38:11.139444 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:11 crc kubenswrapper[4760]: I1204 12:38:11.139578 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:11 crc kubenswrapper[4760]: W1204 12:38:11.171651 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e5e90ab_3e7d_4759_9d7f_56e4837fe9dd.slice/crio-6ca5e81c642500f8984a5f2bc2a2b33553e13626f394adb2a3316ad273c4161a WatchSource:0}: Error finding container 
6ca5e81c642500f8984a5f2bc2a2b33553e13626f394adb2a3316ad273c4161a: Status 404 returned error can't find the container with id 6ca5e81c642500f8984a5f2bc2a2b33553e13626f394adb2a3316ad273c4161a Dec 04 12:38:11 crc kubenswrapper[4760]: I1204 12:38:11.172224 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 12:38:11 crc kubenswrapper[4760]: I1204 12:38:11.197159 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-f6947989-zvg6b" podStartSLOduration=4.197115882 podStartE2EDuration="4.197115882s" podCreationTimestamp="2025-12-04 12:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:11.192388651 +0000 UTC m=+1494.233835238" watchObservedRunningTime="2025-12-04 12:38:11.197115882 +0000 UTC m=+1494.238562449" Dec 04 12:38:11 crc kubenswrapper[4760]: I1204 12:38:11.894025 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" path="/var/lib/kubelet/pods/407783b0-acfe-48e9-87a8-83ba4c28ab08/volumes" Dec 04 12:38:12 crc kubenswrapper[4760]: I1204 12:38:12.209319 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd","Type":"ContainerStarted","Data":"6ca5e81c642500f8984a5f2bc2a2b33553e13626f394adb2a3316ad273c4161a"} Dec 04 12:38:13 crc kubenswrapper[4760]: I1204 12:38:13.242556 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd","Type":"ContainerStarted","Data":"c3caed887a387caffb8cd234026bc35972f7636a1279383a12f958b25bba2c85"} Dec 04 12:38:13 crc kubenswrapper[4760]: I1204 12:38:13.850814 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:13 crc kubenswrapper[4760]: I1204 12:38:13.851570 4760 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="ceilometer-central-agent" containerID="cri-o://b1dfedeff10199442aca4e261d67cc87180f8763c01db6af140c574649b7f42e" gracePeriod=30 Dec 04 12:38:13 crc kubenswrapper[4760]: I1204 12:38:13.852961 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="proxy-httpd" containerID="cri-o://73ed5907d94c6713726335ddd41a9effa5ccd168fbba02f8aa054a9effbd4b5a" gracePeriod=30 Dec 04 12:38:13 crc kubenswrapper[4760]: I1204 12:38:13.853027 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="sg-core" containerID="cri-o://11fcf04cef9e88ef54b99cb0f6171bd76b856d2a352a3af6e19b74a93df21a50" gracePeriod=30 Dec 04 12:38:13 crc kubenswrapper[4760]: I1204 12:38:13.853084 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="ceilometer-notification-agent" containerID="cri-o://dec731349ad60d84bf4026ef2e71dd888e6f369e9b4152fcaf19acb81ed30e23" gracePeriod=30 Dec 04 12:38:13 crc kubenswrapper[4760]: I1204 12:38:13.869470 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="407783b0-acfe-48e9-87a8-83ba4c28ab08" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.166:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:38:14 crc kubenswrapper[4760]: I1204 12:38:14.275602 4760 generic.go:334] "Generic (PLEG): container finished" podID="2d73360c-cea6-4c66-88fc-554bda882906" containerID="73ed5907d94c6713726335ddd41a9effa5ccd168fbba02f8aa054a9effbd4b5a" exitCode=0 Dec 04 12:38:14 crc kubenswrapper[4760]: I1204 12:38:14.275961 4760 
generic.go:334] "Generic (PLEG): container finished" podID="2d73360c-cea6-4c66-88fc-554bda882906" containerID="11fcf04cef9e88ef54b99cb0f6171bd76b856d2a352a3af6e19b74a93df21a50" exitCode=2 Dec 04 12:38:14 crc kubenswrapper[4760]: I1204 12:38:14.275708 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d73360c-cea6-4c66-88fc-554bda882906","Type":"ContainerDied","Data":"73ed5907d94c6713726335ddd41a9effa5ccd168fbba02f8aa054a9effbd4b5a"} Dec 04 12:38:14 crc kubenswrapper[4760]: I1204 12:38:14.276066 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d73360c-cea6-4c66-88fc-554bda882906","Type":"ContainerDied","Data":"11fcf04cef9e88ef54b99cb0f6171bd76b856d2a352a3af6e19b74a93df21a50"} Dec 04 12:38:14 crc kubenswrapper[4760]: I1204 12:38:14.282841 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd","Type":"ContainerStarted","Data":"f9305974f9e8332ce3f33de58dc1beecc7a4d5b87e460a588c7c0395caf872e5"} Dec 04 12:38:14 crc kubenswrapper[4760]: I1204 12:38:14.283200 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 12:38:14 crc kubenswrapper[4760]: I1204 12:38:14.334874 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.334827342 podStartE2EDuration="4.334827342s" podCreationTimestamp="2025-12-04 12:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:14.329369369 +0000 UTC m=+1497.370816166" watchObservedRunningTime="2025-12-04 12:38:14.334827342 +0000 UTC m=+1497.376273909" Dec 04 12:38:14 crc kubenswrapper[4760]: I1204 12:38:14.483844 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 04 12:38:14 crc 
kubenswrapper[4760]: I1204 12:38:14.550879 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 12:38:15 crc kubenswrapper[4760]: I1204 12:38:15.310782 4760 generic.go:334] "Generic (PLEG): container finished" podID="2d73360c-cea6-4c66-88fc-554bda882906" containerID="dec731349ad60d84bf4026ef2e71dd888e6f369e9b4152fcaf19acb81ed30e23" exitCode=0 Dec 04 12:38:15 crc kubenswrapper[4760]: I1204 12:38:15.312034 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d73360c-cea6-4c66-88fc-554bda882906","Type":"ContainerDied","Data":"dec731349ad60d84bf4026ef2e71dd888e6f369e9b4152fcaf19acb81ed30e23"} Dec 04 12:38:15 crc kubenswrapper[4760]: I1204 12:38:15.322617 4760 generic.go:334] "Generic (PLEG): container finished" podID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerID="7754d6e6486a7c466fd3b1bfb32b05bda479f30a77fc71e23d9cfe03f99ca18c" exitCode=137 Dec 04 12:38:15 crc kubenswrapper[4760]: I1204 12:38:15.322712 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7fc6c944-sh7tv" event={"ID":"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5","Type":"ContainerDied","Data":"7754d6e6486a7c466fd3b1bfb32b05bda479f30a77fc71e23d9cfe03f99ca18c"} Dec 04 12:38:15 crc kubenswrapper[4760]: I1204 12:38:15.322929 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7fc6c944-sh7tv" event={"ID":"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5","Type":"ContainerStarted","Data":"6a21d1cb04b2f014d317db98e10c64b9c59da5c0cf7192d9412a2d0ed4984f79"} Dec 04 12:38:15 crc kubenswrapper[4760]: I1204 12:38:15.323541 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="556dacca-8542-42df-97e5-a09db3716d3f" containerName="manila-scheduler" containerID="cri-o://e479c3460aedc4df7d18e03113a64e1bd0b98ef7e9132df4deba1c0408af01ec" gracePeriod=30 Dec 04 12:38:15 crc kubenswrapper[4760]: I1204 12:38:15.323739 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="556dacca-8542-42df-97e5-a09db3716d3f" containerName="probe" containerID="cri-o://e1ff30901d6ee464d733f0400cd869901d93749267a544fb830d64bf9c0bd933" gracePeriod=30 Dec 04 12:38:16 crc kubenswrapper[4760]: I1204 12:38:16.343024 4760 generic.go:334] "Generic (PLEG): container finished" podID="556dacca-8542-42df-97e5-a09db3716d3f" containerID="e1ff30901d6ee464d733f0400cd869901d93749267a544fb830d64bf9c0bd933" exitCode=0 Dec 04 12:38:16 crc kubenswrapper[4760]: I1204 12:38:16.343194 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"556dacca-8542-42df-97e5-a09db3716d3f","Type":"ContainerDied","Data":"e1ff30901d6ee464d733f0400cd869901d93749267a544fb830d64bf9c0bd933"} Dec 04 12:38:16 crc kubenswrapper[4760]: I1204 12:38:16.355306 4760 generic.go:334] "Generic (PLEG): container finished" podID="2d73360c-cea6-4c66-88fc-554bda882906" containerID="b1dfedeff10199442aca4e261d67cc87180f8763c01db6af140c574649b7f42e" exitCode=0 Dec 04 12:38:16 crc kubenswrapper[4760]: I1204 12:38:16.355366 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d73360c-cea6-4c66-88fc-554bda882906","Type":"ContainerDied","Data":"b1dfedeff10199442aca4e261d67cc87180f8763c01db6af140c574649b7f42e"} Dec 04 12:38:18 crc kubenswrapper[4760]: I1204 12:38:18.252604 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:18 crc kubenswrapper[4760]: I1204 12:38:18.256099 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-f6947989-zvg6b" Dec 04 12:38:20 crc kubenswrapper[4760]: I1204 12:38:20.451363 4760 generic.go:334] "Generic (PLEG): container finished" podID="556dacca-8542-42df-97e5-a09db3716d3f" containerID="e479c3460aedc4df7d18e03113a64e1bd0b98ef7e9132df4deba1c0408af01ec" 
exitCode=0 Dec 04 12:38:20 crc kubenswrapper[4760]: I1204 12:38:20.451659 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"556dacca-8542-42df-97e5-a09db3716d3f","Type":"ContainerDied","Data":"e479c3460aedc4df7d18e03113a64e1bd0b98ef7e9132df4deba1c0408af01ec"} Dec 04 12:38:24 crc kubenswrapper[4760]: I1204 12:38:24.175380 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:38:24 crc kubenswrapper[4760]: I1204 12:38:24.176309 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:38:24 crc kubenswrapper[4760]: I1204 12:38:24.177320 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 04 12:38:24 crc kubenswrapper[4760]: I1204 12:38:24.660880 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.180:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:38:24 crc kubenswrapper[4760]: I1204 12:38:24.670110 4760 generic.go:334] "Generic (PLEG): container finished" podID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerID="850fe6326e562e31921afe14eb22f002e3b2f4fe609aaeedf11c8c3082f601e7" exitCode=137 Dec 04 12:38:24 crc kubenswrapper[4760]: I1204 12:38:24.670182 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f8fb5648-87dff" event={"ID":"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc","Type":"ContainerDied","Data":"850fe6326e562e31921afe14eb22f002e3b2f4fe609aaeedf11c8c3082f601e7"} 
Dec 04 12:38:25 crc kubenswrapper[4760]: I1204 12:38:25.584677 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.180:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:38:25 crc kubenswrapper[4760]: I1204 12:38:25.925195 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 04 12:38:29 crc kubenswrapper[4760]: I1204 12:38:29.896713 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 12:38:33 crc kubenswrapper[4760]: E1204 12:38:33.275158 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Dec 04 12:38:33 crc kubenswrapper[4760]: E1204 12:38:33.275895 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n6ch67fhbfh55bhfh56dh9chf7h65bhd6h587h64hb9h6fh67bhf9h648h685h7bh654h67bh84h5dh88h8dh554h58fh67h5bbh695h56hbq,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7z4d5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(edbb46fc-c8ec-45c9-bdb2-36639d92402e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 12:38:33 crc kubenswrapper[4760]: E1204 12:38:33.277534 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="edbb46fc-c8ec-45c9-bdb2-36639d92402e" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.621802 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.656325 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.659478 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-config-data\") pod \"2d73360c-cea6-4c66-88fc-554bda882906\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.659558 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-log-httpd\") pod \"2d73360c-cea6-4c66-88fc-554bda882906\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.659798 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-scripts\") pod \"2d73360c-cea6-4c66-88fc-554bda882906\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.659858 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-combined-ca-bundle\") pod \"2d73360c-cea6-4c66-88fc-554bda882906\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.659969 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hblsz\" (UniqueName: \"kubernetes.io/projected/2d73360c-cea6-4c66-88fc-554bda882906-kube-api-access-hblsz\") pod \"2d73360c-cea6-4c66-88fc-554bda882906\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.660039 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-sg-core-conf-yaml\") pod \"2d73360c-cea6-4c66-88fc-554bda882906\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.660249 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-run-httpd\") pod \"2d73360c-cea6-4c66-88fc-554bda882906\" (UID: \"2d73360c-cea6-4c66-88fc-554bda882906\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.663717 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2d73360c-cea6-4c66-88fc-554bda882906" (UID: "2d73360c-cea6-4c66-88fc-554bda882906"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.665616 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2d73360c-cea6-4c66-88fc-554bda882906" (UID: "2d73360c-cea6-4c66-88fc-554bda882906"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.671030 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.671073 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d73360c-cea6-4c66-88fc-554bda882906-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.678686 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-scripts" (OuterVolumeSpecName: "scripts") pod "2d73360c-cea6-4c66-88fc-554bda882906" (UID: "2d73360c-cea6-4c66-88fc-554bda882906"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.722544 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d73360c-cea6-4c66-88fc-554bda882906-kube-api-access-hblsz" (OuterVolumeSpecName: "kube-api-access-hblsz") pod "2d73360c-cea6-4c66-88fc-554bda882906" (UID: "2d73360c-cea6-4c66-88fc-554bda882906"). InnerVolumeSpecName "kube-api-access-hblsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.773716 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/556dacca-8542-42df-97e5-a09db3716d3f-etc-machine-id\") pod \"556dacca-8542-42df-97e5-a09db3716d3f\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.773913 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-scripts\") pod \"556dacca-8542-42df-97e5-a09db3716d3f\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.773958 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-combined-ca-bundle\") pod \"556dacca-8542-42df-97e5-a09db3716d3f\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.774036 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data\") pod \"556dacca-8542-42df-97e5-a09db3716d3f\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.774327 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data-custom\") pod \"556dacca-8542-42df-97e5-a09db3716d3f\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.774398 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rz2k\" (UniqueName: 
\"kubernetes.io/projected/556dacca-8542-42df-97e5-a09db3716d3f-kube-api-access-9rz2k\") pod \"556dacca-8542-42df-97e5-a09db3716d3f\" (UID: \"556dacca-8542-42df-97e5-a09db3716d3f\") " Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.775584 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hblsz\" (UniqueName: \"kubernetes.io/projected/2d73360c-cea6-4c66-88fc-554bda882906-kube-api-access-hblsz\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.778013 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.780363 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/556dacca-8542-42df-97e5-a09db3716d3f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "556dacca-8542-42df-97e5-a09db3716d3f" (UID: "556dacca-8542-42df-97e5-a09db3716d3f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.793685 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-scripts" (OuterVolumeSpecName: "scripts") pod "556dacca-8542-42df-97e5-a09db3716d3f" (UID: "556dacca-8542-42df-97e5-a09db3716d3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.794134 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/556dacca-8542-42df-97e5-a09db3716d3f-kube-api-access-9rz2k" (OuterVolumeSpecName: "kube-api-access-9rz2k") pod "556dacca-8542-42df-97e5-a09db3716d3f" (UID: "556dacca-8542-42df-97e5-a09db3716d3f"). InnerVolumeSpecName "kube-api-access-9rz2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.799031 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "556dacca-8542-42df-97e5-a09db3716d3f" (UID: "556dacca-8542-42df-97e5-a09db3716d3f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.860854 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2d73360c-cea6-4c66-88fc-554bda882906" (UID: "2d73360c-cea6-4c66-88fc-554bda882906"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.881076 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.881117 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.881131 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rz2k\" (UniqueName: \"kubernetes.io/projected/556dacca-8542-42df-97e5-a09db3716d3f-kube-api-access-9rz2k\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.881140 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/556dacca-8542-42df-97e5-a09db3716d3f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.881149 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.904579 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"556dacca-8542-42df-97e5-a09db3716d3f","Type":"ContainerDied","Data":"2b50fc9d0052371ad1eb69f62a874934a543ee02474dc0bfef93f413527e945c"} Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.904660 4760 scope.go:117] "RemoveContainer" containerID="e1ff30901d6ee464d733f0400cd869901d93749267a544fb830d64bf9c0bd933" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.905038 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.927508 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.927612 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d73360c-cea6-4c66-88fc-554bda882906","Type":"ContainerDied","Data":"0304a3431d286ab35856afdf02f8fe8b3b9133c197290df9da7ab27c50d6c9d9"} Dec 04 12:38:33 crc kubenswrapper[4760]: E1204 12:38:33.948280 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="edbb46fc-c8ec-45c9-bdb2-36639d92402e" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.979379 4760 scope.go:117] "RemoveContainer" containerID="e479c3460aedc4df7d18e03113a64e1bd0b98ef7e9132df4deba1c0408af01ec" Dec 04 12:38:33 crc kubenswrapper[4760]: I1204 12:38:33.989031 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d73360c-cea6-4c66-88fc-554bda882906" (UID: "2d73360c-cea6-4c66-88fc-554bda882906"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.002442 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "556dacca-8542-42df-97e5-a09db3716d3f" (UID: "556dacca-8542-42df-97e5-a09db3716d3f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.010424 4760 scope.go:117] "RemoveContainer" containerID="73ed5907d94c6713726335ddd41a9effa5ccd168fbba02f8aa054a9effbd4b5a" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.011588 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-config-data" (OuterVolumeSpecName: "config-data") pod "2d73360c-cea6-4c66-88fc-554bda882906" (UID: "2d73360c-cea6-4c66-88fc-554bda882906"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.046463 4760 scope.go:117] "RemoveContainer" containerID="11fcf04cef9e88ef54b99cb0f6171bd76b856d2a352a3af6e19b74a93df21a50" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.086758 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.086813 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.086830 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d73360c-cea6-4c66-88fc-554bda882906-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.090161 4760 scope.go:117] "RemoveContainer" containerID="dec731349ad60d84bf4026ef2e71dd888e6f369e9b4152fcaf19acb81ed30e23" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.093096 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data" (OuterVolumeSpecName: "config-data") pod "556dacca-8542-42df-97e5-a09db3716d3f" (UID: "556dacca-8542-42df-97e5-a09db3716d3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.121396 4760 scope.go:117] "RemoveContainer" containerID="b1dfedeff10199442aca4e261d67cc87180f8763c01db6af140c574649b7f42e" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.182826 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.189193 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/556dacca-8542-42df-97e5-a09db3716d3f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.302366 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.334100 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.347016 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 12:38:34 crc kubenswrapper[4760]: E1204 12:38:34.347747 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="ceilometer-central-agent" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.347770 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="ceilometer-central-agent" Dec 04 
12:38:34 crc kubenswrapper[4760]: E1204 12:38:34.347961 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="sg-core" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.347969 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="sg-core" Dec 04 12:38:34 crc kubenswrapper[4760]: E1204 12:38:34.347987 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556dacca-8542-42df-97e5-a09db3716d3f" containerName="manila-scheduler" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.347993 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="556dacca-8542-42df-97e5-a09db3716d3f" containerName="manila-scheduler" Dec 04 12:38:34 crc kubenswrapper[4760]: E1204 12:38:34.348006 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="556dacca-8542-42df-97e5-a09db3716d3f" containerName="probe" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.348014 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="556dacca-8542-42df-97e5-a09db3716d3f" containerName="probe" Dec 04 12:38:34 crc kubenswrapper[4760]: E1204 12:38:34.348031 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="ceilometer-notification-agent" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.348037 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="ceilometer-notification-agent" Dec 04 12:38:34 crc kubenswrapper[4760]: E1204 12:38:34.348062 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="proxy-httpd" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.348068 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="proxy-httpd" Dec 04 12:38:34 crc 
kubenswrapper[4760]: I1204 12:38:34.348337 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="proxy-httpd" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.348348 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="sg-core" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.348368 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="ceilometer-notification-agent" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.348387 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="ceilometer-central-agent" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.348395 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="556dacca-8542-42df-97e5-a09db3716d3f" containerName="manila-scheduler" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.348407 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="556dacca-8542-42df-97e5-a09db3716d3f" containerName="probe" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.350525 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.364538 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.369887 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.413366 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.413646 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-scripts\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.413863 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.413934 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-config-data\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.413981 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da99ecae-3ef9-484d-a420-0317df7654d5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.414172 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvppb\" (UniqueName: \"kubernetes.io/projected/da99ecae-3ef9-484d-a420-0317df7654d5-kube-api-access-zvppb\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.476493 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.485023 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.495402 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.498532 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.507168 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.507489 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.508100 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.521427 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvppb\" (UniqueName: \"kubernetes.io/projected/da99ecae-3ef9-484d-a420-0317df7654d5-kube-api-access-zvppb\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.521594 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.521671 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-scripts\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.521742 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " 
pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.521775 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-config-data\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.521835 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da99ecae-3ef9-484d-a420-0317df7654d5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.521955 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da99ecae-3ef9-484d-a420-0317df7654d5-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.529813 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.530845 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-scripts\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.535063 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-config-data\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.535642 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da99ecae-3ef9-484d-a420-0317df7654d5-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.552922 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvppb\" (UniqueName: \"kubernetes.io/projected/da99ecae-3ef9-484d-a420-0317df7654d5-kube-api-access-zvppb\") pod \"manila-scheduler-0\" (UID: \"da99ecae-3ef9-484d-a420-0317df7654d5\") " pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.625065 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brgcq\" (UniqueName: \"kubernetes.io/projected/9062b0c5-1053-4a92-a4d1-0830846e5e7f-kube-api-access-brgcq\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.625163 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-config-data\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.625185 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.625242 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-log-httpd\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.625357 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.625431 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-run-httpd\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.625477 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-scripts\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.706730 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.729676 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brgcq\" (UniqueName: \"kubernetes.io/projected/9062b0c5-1053-4a92-a4d1-0830846e5e7f-kube-api-access-brgcq\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.733778 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-config-data\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.733836 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.733940 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-log-httpd\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.734099 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.734295 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-run-httpd\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.734408 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-scripts\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.736473 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-log-httpd\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.737555 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-run-httpd\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.741322 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-scripts\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.744202 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.745943 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.761177 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brgcq\" (UniqueName: \"kubernetes.io/projected/9062b0c5-1053-4a92-a4d1-0830846e5e7f-kube-api-access-brgcq\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.762421 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-config-data\") pod \"ceilometer-0\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.937906 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.964354 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d4cd7036-99ee-48dc-8df1-63c34f54087b","Type":"ContainerStarted","Data":"8e52f36f78c1cf2755bdaa578909f50d145f6968d91b7e131117e2b9c9273634"} Dec 04 12:38:34 crc kubenswrapper[4760]: I1204 12:38:34.970947 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f8fb5648-87dff" event={"ID":"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc","Type":"ContainerStarted","Data":"7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba"} Dec 04 12:38:35 crc kubenswrapper[4760]: I1204 12:38:35.444770 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 12:38:35 crc kubenswrapper[4760]: W1204 12:38:35.452422 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda99ecae_3ef9_484d_a420_0317df7654d5.slice/crio-1cbdf9732b32ff62901cc0c8676326c8227ca0ac83df9cf32e329c5dbacc3265 WatchSource:0}: Error finding container 1cbdf9732b32ff62901cc0c8676326c8227ca0ac83df9cf32e329c5dbacc3265: Status 404 returned error can't find the container with id 1cbdf9732b32ff62901cc0c8676326c8227ca0ac83df9cf32e329c5dbacc3265 Dec 04 12:38:35 crc kubenswrapper[4760]: I1204 12:38:35.652512 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:35 crc kubenswrapper[4760]: I1204 12:38:35.918310 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d73360c-cea6-4c66-88fc-554bda882906" path="/var/lib/kubelet/pods/2d73360c-cea6-4c66-88fc-554bda882906/volumes" Dec 04 12:38:36 crc kubenswrapper[4760]: I1204 12:38:36.166455 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="556dacca-8542-42df-97e5-a09db3716d3f" 
path="/var/lib/kubelet/pods/556dacca-8542-42df-97e5-a09db3716d3f/volumes" Dec 04 12:38:36 crc kubenswrapper[4760]: I1204 12:38:36.167789 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"da99ecae-3ef9-484d-a420-0317df7654d5","Type":"ContainerStarted","Data":"1cbdf9732b32ff62901cc0c8676326c8227ca0ac83df9cf32e329c5dbacc3265"} Dec 04 12:38:36 crc kubenswrapper[4760]: I1204 12:38:36.167842 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d4cd7036-99ee-48dc-8df1-63c34f54087b","Type":"ContainerStarted","Data":"d09b3d7691c24a637c42570a3d8c74239e3039c6a5e79e374abc2403489c3891"} Dec 04 12:38:36 crc kubenswrapper[4760]: I1204 12:38:36.167861 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9062b0c5-1053-4a92-a4d1-0830846e5e7f","Type":"ContainerStarted","Data":"bbe1475238ba85f9fce17c103211194f1e48d76b96dcdee34f6324c6e2e01810"} Dec 04 12:38:37 crc kubenswrapper[4760]: I1204 12:38:37.036926 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=6.391283615 podStartE2EDuration="48.036903088s" podCreationTimestamp="2025-12-04 12:37:49 +0000 UTC" firstStartedPulling="2025-12-04 12:37:51.682908659 +0000 UTC m=+1474.724355216" lastFinishedPulling="2025-12-04 12:38:33.328528112 +0000 UTC m=+1516.369974689" observedRunningTime="2025-12-04 12:38:37.031752705 +0000 UTC m=+1520.073199282" watchObservedRunningTime="2025-12-04 12:38:37.036903088 +0000 UTC m=+1520.078349655" Dec 04 12:38:38 crc kubenswrapper[4760]: I1204 12:38:38.026635 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"da99ecae-3ef9-484d-a420-0317df7654d5","Type":"ContainerStarted","Data":"132b162e838beb493fe42b7e0aa7b6d29a45c49ab52d7c9727962e67db93a8fc"} Dec 04 12:38:39 crc kubenswrapper[4760]: I1204 12:38:39.039914 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9062b0c5-1053-4a92-a4d1-0830846e5e7f","Type":"ContainerStarted","Data":"0f1949c984123609dea6fc2dc1352f99007579d0af8ff58ef99997830f6e72c7"} Dec 04 12:38:39 crc kubenswrapper[4760]: I1204 12:38:39.043723 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"da99ecae-3ef9-484d-a420-0317df7654d5","Type":"ContainerStarted","Data":"2fd94134191319c18223b0a782b03186b06586471f16cf9285371a139e8feaa1"} Dec 04 12:38:39 crc kubenswrapper[4760]: I1204 12:38:39.078989 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=5.078963369 podStartE2EDuration="5.078963369s" podCreationTimestamp="2025-12-04 12:38:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:39.070611994 +0000 UTC m=+1522.112058571" watchObservedRunningTime="2025-12-04 12:38:39.078963369 +0000 UTC m=+1522.120409936" Dec 04 12:38:40 crc kubenswrapper[4760]: I1204 12:38:40.087759 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 04 12:38:43 crc kubenswrapper[4760]: I1204 12:38:43.091407 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9062b0c5-1053-4a92-a4d1-0830846e5e7f","Type":"ContainerStarted","Data":"7e9f6e8dcedf5ef24ca9231cd09e9bb54318c296e1ab2b58b8ce7560c92d5cd9"} Dec 04 12:38:43 crc kubenswrapper[4760]: I1204 12:38:43.913325 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:38:43 crc kubenswrapper[4760]: I1204 12:38:43.913817 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:38:44 crc kubenswrapper[4760]: I1204 12:38:44.175472 4760 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 04 12:38:44 crc kubenswrapper[4760]: I1204 12:38:44.175612 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:38:44 crc kubenswrapper[4760]: I1204 12:38:44.176938 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"6a21d1cb04b2f014d317db98e10c64b9c59da5c0cf7192d9412a2d0ed4984f79"} pod="openstack/horizon-5b7fc6c944-sh7tv" containerMessage="Container horizon failed startup probe, will be restarted" Dec 04 12:38:44 crc kubenswrapper[4760]: I1204 12:38:44.176994 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerName="horizon" containerID="cri-o://6a21d1cb04b2f014d317db98e10c64b9c59da5c0cf7192d9412a2d0ed4984f79" gracePeriod=30 Dec 04 12:38:44 crc kubenswrapper[4760]: I1204 12:38:44.707785 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 04 12:38:45 crc kubenswrapper[4760]: I1204 12:38:45.119171 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9062b0c5-1053-4a92-a4d1-0830846e5e7f","Type":"ContainerStarted","Data":"b7e276e77618243cc3d581a472a7fe921ea0c8c0701214c2c24ce96d9d6c4dd9"} Dec 04 12:38:48 crc kubenswrapper[4760]: I1204 12:38:48.165535 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9062b0c5-1053-4a92-a4d1-0830846e5e7f","Type":"ContainerStarted","Data":"bab21263166473b2fa9474ebceda1ed49d6836b279e59332f8129dda0c61b6a1"} Dec 04 12:38:48 crc 
kubenswrapper[4760]: I1204 12:38:48.166107 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 12:38:48 crc kubenswrapper[4760]: I1204 12:38:48.197534 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.642715585 podStartE2EDuration="14.197503269s" podCreationTimestamp="2025-12-04 12:38:34 +0000 UTC" firstStartedPulling="2025-12-04 12:38:35.659398492 +0000 UTC m=+1518.700845059" lastFinishedPulling="2025-12-04 12:38:47.214186176 +0000 UTC m=+1530.255632743" observedRunningTime="2025-12-04 12:38:48.194397511 +0000 UTC m=+1531.235844088" watchObservedRunningTime="2025-12-04 12:38:48.197503269 +0000 UTC m=+1531.238949836" Dec 04 12:38:49 crc kubenswrapper[4760]: I1204 12:38:49.894719 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:50 crc kubenswrapper[4760]: I1204 12:38:50.186037 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="ceilometer-central-agent" containerID="cri-o://0f1949c984123609dea6fc2dc1352f99007579d0af8ff58ef99997830f6e72c7" gracePeriod=30 Dec 04 12:38:50 crc kubenswrapper[4760]: I1204 12:38:50.186161 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="proxy-httpd" containerID="cri-o://bab21263166473b2fa9474ebceda1ed49d6836b279e59332f8129dda0c61b6a1" gracePeriod=30 Dec 04 12:38:50 crc kubenswrapper[4760]: I1204 12:38:50.186083 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="ceilometer-notification-agent" containerID="cri-o://7e9f6e8dcedf5ef24ca9231cd09e9bb54318c296e1ab2b58b8ce7560c92d5cd9" gracePeriod=30 Dec 04 12:38:50 crc 
kubenswrapper[4760]: I1204 12:38:50.186110 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="sg-core" containerID="cri-o://b7e276e77618243cc3d581a472a7fe921ea0c8c0701214c2c24ce96d9d6c4dd9" gracePeriod=30 Dec 04 12:38:50 crc kubenswrapper[4760]: I1204 12:38:50.273841 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7vhlm"] Dec 04 12:38:50 crc kubenswrapper[4760]: I1204 12:38:50.275719 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7vhlm" Dec 04 12:38:50 crc kubenswrapper[4760]: I1204 12:38:50.300798 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7vhlm"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.411196 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36730972-a8b8-4dd7-8335-805b4b694e42-operator-scripts\") pod \"nova-api-db-create-7vhlm\" (UID: \"36730972-a8b8-4dd7-8335-805b4b694e42\") " pod="openstack/nova-api-db-create-7vhlm" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.411357 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7npv\" (UniqueName: \"kubernetes.io/projected/36730972-a8b8-4dd7-8335-805b4b694e42-kube-api-access-v7npv\") pod \"nova-api-db-create-7vhlm\" (UID: \"36730972-a8b8-4dd7-8335-805b4b694e42\") " pod="openstack/nova-api-db-create-7vhlm" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.518075 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7npv\" (UniqueName: \"kubernetes.io/projected/36730972-a8b8-4dd7-8335-805b4b694e42-kube-api-access-v7npv\") pod \"nova-api-db-create-7vhlm\" (UID: \"36730972-a8b8-4dd7-8335-805b4b694e42\") " 
pod="openstack/nova-api-db-create-7vhlm" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.518337 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36730972-a8b8-4dd7-8335-805b4b694e42-operator-scripts\") pod \"nova-api-db-create-7vhlm\" (UID: \"36730972-a8b8-4dd7-8335-805b4b694e42\") " pod="openstack/nova-api-db-create-7vhlm" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.519323 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36730972-a8b8-4dd7-8335-805b4b694e42-operator-scripts\") pod \"nova-api-db-create-7vhlm\" (UID: \"36730972-a8b8-4dd7-8335-805b4b694e42\") " pod="openstack/nova-api-db-create-7vhlm" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.571042 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7npv\" (UniqueName: \"kubernetes.io/projected/36730972-a8b8-4dd7-8335-805b4b694e42-kube-api-access-v7npv\") pod \"nova-api-db-create-7vhlm\" (UID: \"36730972-a8b8-4dd7-8335-805b4b694e42\") " pod="openstack/nova-api-db-create-7vhlm" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.616597 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7vhlm" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.624255 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-nszzz"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.626312 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nszzz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.679539 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nszzz"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.718502 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.719279 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" containerName="glance-log" containerID="cri-o://687679c5e5a0e7a203ada44ea9f444017276f4c4fbac1d624bf04a251f7494e0" gracePeriod=30 Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.719523 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" containerName="glance-httpd" containerID="cri-o://b4890983a20fcaf634982b513116b99dbc34aa74525b3efaf48e5a5d4349a66b" gracePeriod=30 Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.763535 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ws6\" (UniqueName: \"kubernetes.io/projected/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-kube-api-access-28ws6\") pod \"nova-cell0-db-create-nszzz\" (UID: \"9053d7ed-6d5e-44fe-ac2d-17ee4719a590\") " pod="openstack/nova-cell0-db-create-nszzz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.764438 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-operator-scripts\") pod \"nova-cell0-db-create-nszzz\" (UID: \"9053d7ed-6d5e-44fe-ac2d-17ee4719a590\") " pod="openstack/nova-cell0-db-create-nszzz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 
12:38:50.869496 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-operator-scripts\") pod \"nova-cell0-db-create-nszzz\" (UID: \"9053d7ed-6d5e-44fe-ac2d-17ee4719a590\") " pod="openstack/nova-cell0-db-create-nszzz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.869594 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28ws6\" (UniqueName: \"kubernetes.io/projected/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-kube-api-access-28ws6\") pod \"nova-cell0-db-create-nszzz\" (UID: \"9053d7ed-6d5e-44fe-ac2d-17ee4719a590\") " pod="openstack/nova-cell0-db-create-nszzz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.877398 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5f284"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.879361 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5f284" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.881367 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-operator-scripts\") pod \"nova-cell0-db-create-nszzz\" (UID: \"9053d7ed-6d5e-44fe-ac2d-17ee4719a590\") " pod="openstack/nova-cell0-db-create-nszzz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.932290 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28ws6\" (UniqueName: \"kubernetes.io/projected/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-kube-api-access-28ws6\") pod \"nova-cell0-db-create-nszzz\" (UID: \"9053d7ed-6d5e-44fe-ac2d-17ee4719a590\") " pod="openstack/nova-cell0-db-create-nszzz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.949595 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6c68-account-create-update-96z8z"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.951914 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6c68-account-create-update-96z8z" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.967915 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.971498 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b4fdc4d-3353-4e41-8218-282fef7f1418-operator-scripts\") pod \"nova-cell1-db-create-5f284\" (UID: \"3b4fdc4d-3353-4e41-8218-282fef7f1418\") " pod="openstack/nova-cell1-db-create-5f284" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.971681 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nfqc\" (UniqueName: \"kubernetes.io/projected/3b4fdc4d-3353-4e41-8218-282fef7f1418-kube-api-access-7nfqc\") pod \"nova-cell1-db-create-5f284\" (UID: \"3b4fdc4d-3353-4e41-8218-282fef7f1418\") " pod="openstack/nova-cell1-db-create-5f284" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:50.986152 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5f284"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.010445 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6c68-account-create-update-96z8z"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.029936 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9fc7-account-create-update-6ctmc"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.041144 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.048586 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.067286 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9fc7-account-create-update-6ctmc"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.075765 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87816d08-ea62-428d-a43e-40b3e030afb5-operator-scripts\") pod \"nova-api-6c68-account-create-update-96z8z\" (UID: \"87816d08-ea62-428d-a43e-40b3e030afb5\") " pod="openstack/nova-api-6c68-account-create-update-96z8z" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.075894 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b4fdc4d-3353-4e41-8218-282fef7f1418-operator-scripts\") pod \"nova-cell1-db-create-5f284\" (UID: \"3b4fdc4d-3353-4e41-8218-282fef7f1418\") " pod="openstack/nova-cell1-db-create-5f284" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.076056 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zfp9\" (UniqueName: \"kubernetes.io/projected/87816d08-ea62-428d-a43e-40b3e030afb5-kube-api-access-6zfp9\") pod \"nova-api-6c68-account-create-update-96z8z\" (UID: \"87816d08-ea62-428d-a43e-40b3e030afb5\") " pod="openstack/nova-api-6c68-account-create-update-96z8z" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.076166 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nfqc\" (UniqueName: \"kubernetes.io/projected/3b4fdc4d-3353-4e41-8218-282fef7f1418-kube-api-access-7nfqc\") pod 
\"nova-cell1-db-create-5f284\" (UID: \"3b4fdc4d-3353-4e41-8218-282fef7f1418\") " pod="openstack/nova-cell1-db-create-5f284" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.077035 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b4fdc4d-3353-4e41-8218-282fef7f1418-operator-scripts\") pod \"nova-cell1-db-create-5f284\" (UID: \"3b4fdc4d-3353-4e41-8218-282fef7f1418\") " pod="openstack/nova-cell1-db-create-5f284" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.109424 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9784-account-create-update-vw2jz"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.111109 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nfqc\" (UniqueName: \"kubernetes.io/projected/3b4fdc4d-3353-4e41-8218-282fef7f1418-kube-api-access-7nfqc\") pod \"nova-cell1-db-create-5f284\" (UID: \"3b4fdc4d-3353-4e41-8218-282fef7f1418\") " pod="openstack/nova-cell1-db-create-5f284" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.113073 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9784-account-create-update-vw2jz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.119392 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.132200 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9784-account-create-update-vw2jz"] Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.183925 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b417eac6-2ecb-42b0-a9ad-23860eaefde3-operator-scripts\") pod \"nova-cell0-9fc7-account-create-update-6ctmc\" (UID: \"b417eac6-2ecb-42b0-a9ad-23860eaefde3\") " pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.184053 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e0b6f32-b982-4267-a0b8-b977b91f187c-operator-scripts\") pod \"nova-cell1-9784-account-create-update-vw2jz\" (UID: \"9e0b6f32-b982-4267-a0b8-b977b91f187c\") " pod="openstack/nova-cell1-9784-account-create-update-vw2jz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.184296 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87816d08-ea62-428d-a43e-40b3e030afb5-operator-scripts\") pod \"nova-api-6c68-account-create-update-96z8z\" (UID: \"87816d08-ea62-428d-a43e-40b3e030afb5\") " pod="openstack/nova-api-6c68-account-create-update-96z8z" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.184348 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jpct\" (UniqueName: 
\"kubernetes.io/projected/b417eac6-2ecb-42b0-a9ad-23860eaefde3-kube-api-access-7jpct\") pod \"nova-cell0-9fc7-account-create-update-6ctmc\" (UID: \"b417eac6-2ecb-42b0-a9ad-23860eaefde3\") " pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.184445 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6gzf\" (UniqueName: \"kubernetes.io/projected/9e0b6f32-b982-4267-a0b8-b977b91f187c-kube-api-access-d6gzf\") pod \"nova-cell1-9784-account-create-update-vw2jz\" (UID: \"9e0b6f32-b982-4267-a0b8-b977b91f187c\") " pod="openstack/nova-cell1-9784-account-create-update-vw2jz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.184476 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zfp9\" (UniqueName: \"kubernetes.io/projected/87816d08-ea62-428d-a43e-40b3e030afb5-kube-api-access-6zfp9\") pod \"nova-api-6c68-account-create-update-96z8z\" (UID: \"87816d08-ea62-428d-a43e-40b3e030afb5\") " pod="openstack/nova-api-6c68-account-create-update-96z8z" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.186026 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87816d08-ea62-428d-a43e-40b3e030afb5-operator-scripts\") pod \"nova-api-6c68-account-create-update-96z8z\" (UID: \"87816d08-ea62-428d-a43e-40b3e030afb5\") " pod="openstack/nova-api-6c68-account-create-update-96z8z" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.223455 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zfp9\" (UniqueName: \"kubernetes.io/projected/87816d08-ea62-428d-a43e-40b3e030afb5-kube-api-access-6zfp9\") pod \"nova-api-6c68-account-create-update-96z8z\" (UID: \"87816d08-ea62-428d-a43e-40b3e030afb5\") " pod="openstack/nova-api-6c68-account-create-update-96z8z" Dec 04 12:38:51 crc 
kubenswrapper[4760]: I1204 12:38:51.246556 4760 generic.go:334] "Generic (PLEG): container finished" podID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerID="b7e276e77618243cc3d581a472a7fe921ea0c8c0701214c2c24ce96d9d6c4dd9" exitCode=2 Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.246733 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9062b0c5-1053-4a92-a4d1-0830846e5e7f","Type":"ContainerDied","Data":"b7e276e77618243cc3d581a472a7fe921ea0c8c0701214c2c24ce96d9d6c4dd9"} Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.270814 4760 generic.go:334] "Generic (PLEG): container finished" podID="ead32194-7c87-4c05-99b6-55a928499e0d" containerID="687679c5e5a0e7a203ada44ea9f444017276f4c4fbac1d624bf04a251f7494e0" exitCode=143 Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.271196 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ead32194-7c87-4c05-99b6-55a928499e0d","Type":"ContainerDied","Data":"687679c5e5a0e7a203ada44ea9f444017276f4c4fbac1d624bf04a251f7494e0"} Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.287913 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b417eac6-2ecb-42b0-a9ad-23860eaefde3-operator-scripts\") pod \"nova-cell0-9fc7-account-create-update-6ctmc\" (UID: \"b417eac6-2ecb-42b0-a9ad-23860eaefde3\") " pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.288059 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e0b6f32-b982-4267-a0b8-b977b91f187c-operator-scripts\") pod \"nova-cell1-9784-account-create-update-vw2jz\" (UID: \"9e0b6f32-b982-4267-a0b8-b977b91f187c\") " pod="openstack/nova-cell1-9784-account-create-update-vw2jz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 
12:38:51.288233 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jpct\" (UniqueName: \"kubernetes.io/projected/b417eac6-2ecb-42b0-a9ad-23860eaefde3-kube-api-access-7jpct\") pod \"nova-cell0-9fc7-account-create-update-6ctmc\" (UID: \"b417eac6-2ecb-42b0-a9ad-23860eaefde3\") " pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.288347 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6gzf\" (UniqueName: \"kubernetes.io/projected/9e0b6f32-b982-4267-a0b8-b977b91f187c-kube-api-access-d6gzf\") pod \"nova-cell1-9784-account-create-update-vw2jz\" (UID: \"9e0b6f32-b982-4267-a0b8-b977b91f187c\") " pod="openstack/nova-cell1-9784-account-create-update-vw2jz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.291993 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e0b6f32-b982-4267-a0b8-b977b91f187c-operator-scripts\") pod \"nova-cell1-9784-account-create-update-vw2jz\" (UID: \"9e0b6f32-b982-4267-a0b8-b977b91f187c\") " pod="openstack/nova-cell1-9784-account-create-update-vw2jz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.292604 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b417eac6-2ecb-42b0-a9ad-23860eaefde3-operator-scripts\") pod \"nova-cell0-9fc7-account-create-update-6ctmc\" (UID: \"b417eac6-2ecb-42b0-a9ad-23860eaefde3\") " pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.329034 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jpct\" (UniqueName: \"kubernetes.io/projected/b417eac6-2ecb-42b0-a9ad-23860eaefde3-kube-api-access-7jpct\") pod \"nova-cell0-9fc7-account-create-update-6ctmc\" (UID: 
\"b417eac6-2ecb-42b0-a9ad-23860eaefde3\") " pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.333050 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6gzf\" (UniqueName: \"kubernetes.io/projected/9e0b6f32-b982-4267-a0b8-b977b91f187c-kube-api-access-d6gzf\") pod \"nova-cell1-9784-account-create-update-vw2jz\" (UID: \"9e0b6f32-b982-4267-a0b8-b977b91f187c\") " pod="openstack/nova-cell1-9784-account-create-update-vw2jz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.421128 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nszzz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.553971 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5f284" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.595854 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c68-account-create-update-96z8z" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.655745 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.676441 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9784-account-create-update-vw2jz" Dec 04 12:38:51 crc kubenswrapper[4760]: I1204 12:38:51.684290 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7vhlm"] Dec 04 12:38:51 crc kubenswrapper[4760]: W1204 12:38:51.745468 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36730972_a8b8_4dd7_8335_805b4b694e42.slice/crio-acacae4a5308282cdd4f2aef1a8452ccc966c1aba1c3517e10245611dcc9031e WatchSource:0}: Error finding container acacae4a5308282cdd4f2aef1a8452ccc966c1aba1c3517e10245611dcc9031e: Status 404 returned error can't find the container with id acacae4a5308282cdd4f2aef1a8452ccc966c1aba1c3517e10245611dcc9031e Dec 04 12:38:52 crc kubenswrapper[4760]: W1204 12:38:52.145394 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9053d7ed_6d5e_44fe_ac2d_17ee4719a590.slice/crio-4d5ea3e1699e52b08714f28294597d170525c75a9bc15252bf3518442c6e2d93 WatchSource:0}: Error finding container 4d5ea3e1699e52b08714f28294597d170525c75a9bc15252bf3518442c6e2d93: Status 404 returned error can't find the container with id 4d5ea3e1699e52b08714f28294597d170525c75a9bc15252bf3518442c6e2d93 Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.175381 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nszzz"] Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.329963 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5f284"] Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.357502 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7vhlm" event={"ID":"36730972-a8b8-4dd7-8335-805b4b694e42","Type":"ContainerStarted","Data":"acacae4a5308282cdd4f2aef1a8452ccc966c1aba1c3517e10245611dcc9031e"} Dec 04 12:38:52 crc 
kubenswrapper[4760]: I1204 12:38:52.379713 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nszzz" event={"ID":"9053d7ed-6d5e-44fe-ac2d-17ee4719a590","Type":"ContainerStarted","Data":"4d5ea3e1699e52b08714f28294597d170525c75a9bc15252bf3518442c6e2d93"} Dec 04 12:38:52 crc kubenswrapper[4760]: W1204 12:38:52.386472 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b4fdc4d_3353_4e41_8218_282fef7f1418.slice/crio-8768deff5edf17dfaf3e55bbb57880f36b8c90377c56f7a1c06f9acd075b6589 WatchSource:0}: Error finding container 8768deff5edf17dfaf3e55bbb57880f36b8c90377c56f7a1c06f9acd075b6589: Status 404 returned error can't find the container with id 8768deff5edf17dfaf3e55bbb57880f36b8c90377c56f7a1c06f9acd075b6589 Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.405313 4760 generic.go:334] "Generic (PLEG): container finished" podID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerID="bab21263166473b2fa9474ebceda1ed49d6836b279e59332f8129dda0c61b6a1" exitCode=0 Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.405349 4760 generic.go:334] "Generic (PLEG): container finished" podID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerID="7e9f6e8dcedf5ef24ca9231cd09e9bb54318c296e1ab2b58b8ce7560c92d5cd9" exitCode=0 Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.405399 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9062b0c5-1053-4a92-a4d1-0830846e5e7f","Type":"ContainerDied","Data":"bab21263166473b2fa9474ebceda1ed49d6836b279e59332f8129dda0c61b6a1"} Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.405434 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9062b0c5-1053-4a92-a4d1-0830846e5e7f","Type":"ContainerDied","Data":"7e9f6e8dcedf5ef24ca9231cd09e9bb54318c296e1ab2b58b8ce7560c92d5cd9"} Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 
12:38:52.415150 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"edbb46fc-c8ec-45c9-bdb2-36639d92402e","Type":"ContainerStarted","Data":"76ed0fd56e872e1ea6f0d9f8e6091e7594ae7d68d60f5ad9c98cc04efc391549"} Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.445107 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9fc7-account-create-update-6ctmc"] Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.471040 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.004307858 podStartE2EDuration="1m5.470997393s" podCreationTimestamp="2025-12-04 12:37:47 +0000 UTC" firstStartedPulling="2025-12-04 12:37:49.077075702 +0000 UTC m=+1472.118522269" lastFinishedPulling="2025-12-04 12:38:50.543765237 +0000 UTC m=+1533.585211804" observedRunningTime="2025-12-04 12:38:52.444462221 +0000 UTC m=+1535.485908788" watchObservedRunningTime="2025-12-04 12:38:52.470997393 +0000 UTC m=+1535.512443960" Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.552333 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9784-account-create-update-vw2jz"] Dec 04 12:38:52 crc kubenswrapper[4760]: I1204 12:38:52.594136 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6c68-account-create-update-96z8z"] Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.433644 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nszzz" event={"ID":"9053d7ed-6d5e-44fe-ac2d-17ee4719a590","Type":"ContainerStarted","Data":"94a423a2ffd13e04dbd4b909957b6c76b3ad60e18a51e38a6d7bfcbb18e243a9"} Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.435723 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9784-account-create-update-vw2jz" 
event={"ID":"9e0b6f32-b982-4267-a0b8-b977b91f187c","Type":"ContainerStarted","Data":"10ecf2515015ca9ef4e4759055a22fa3131ff6f83c7b98b7545f64f888176d88"} Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.435782 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9784-account-create-update-vw2jz" event={"ID":"9e0b6f32-b982-4267-a0b8-b977b91f187c","Type":"ContainerStarted","Data":"764543efe2a5a7358f4331feb4e918771d38c2668616f3e0c6766b1415c83cec"} Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.438441 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.438543 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c68-account-create-update-96z8z" event={"ID":"87816d08-ea62-428d-a43e-40b3e030afb5","Type":"ContainerStarted","Data":"63486cd05984cc68b3dc2624522f8411eb759180ab672a45d3d30cd335427950"} Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.438580 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c68-account-create-update-96z8z" event={"ID":"87816d08-ea62-428d-a43e-40b3e030afb5","Type":"ContainerStarted","Data":"f488217e675d0da102515e7131a14dc853acc257883e76721bd858bf4fa86a10"} Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.442385 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" event={"ID":"b417eac6-2ecb-42b0-a9ad-23860eaefde3","Type":"ContainerStarted","Data":"51d4a3aaa35392c3a3ba9d4e0a4396713a8d4da9ffefd319c448ad7cfea5c85f"} Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.442438 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" event={"ID":"b417eac6-2ecb-42b0-a9ad-23860eaefde3","Type":"ContainerStarted","Data":"53d5c36d0b2cb74c5a1081fd0bf0b59e75daea5dde5dc34a53884a07209ea211"} Dec 04 12:38:53 crc 
kubenswrapper[4760]: I1204 12:38:53.448619 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7vhlm" event={"ID":"36730972-a8b8-4dd7-8335-805b4b694e42","Type":"ContainerStarted","Data":"b62f6d2b0afedecc875b3ba8b0f7a8c578b971238d9e7cbeb5f09a19e8f27452"} Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.450949 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5f284" event={"ID":"3b4fdc4d-3353-4e41-8218-282fef7f1418","Type":"ContainerStarted","Data":"eaf187efe64777a275756f0b4323aef50b158ed5d1768e9779421cbbe1d68feb"} Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.450994 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5f284" event={"ID":"3b4fdc4d-3353-4e41-8218-282fef7f1418","Type":"ContainerStarted","Data":"8768deff5edf17dfaf3e55bbb57880f36b8c90377c56f7a1c06f9acd075b6589"} Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.460824 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-nszzz" podStartSLOduration=3.4607709509999998 podStartE2EDuration="3.460770951s" podCreationTimestamp="2025-12-04 12:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:53.452556101 +0000 UTC m=+1536.494002678" watchObservedRunningTime="2025-12-04 12:38:53.460770951 +0000 UTC m=+1536.502217518" Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.473690 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-7vhlm" podStartSLOduration=3.473653971 podStartE2EDuration="3.473653971s" podCreationTimestamp="2025-12-04 12:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:53.470461059 +0000 UTC m=+1536.511907616" 
watchObservedRunningTime="2025-12-04 12:38:53.473653971 +0000 UTC m=+1536.515100538" Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.509579 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-5f284" podStartSLOduration=3.50954781 podStartE2EDuration="3.50954781s" podCreationTimestamp="2025-12-04 12:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:53.487241592 +0000 UTC m=+1536.528688179" watchObservedRunningTime="2025-12-04 12:38:53.50954781 +0000 UTC m=+1536.550994377" Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.524170 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-6c68-account-create-update-96z8z" podStartSLOduration=3.523899286 podStartE2EDuration="3.523899286s" podCreationTimestamp="2025-12-04 12:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:53.508676472 +0000 UTC m=+1536.550123039" watchObservedRunningTime="2025-12-04 12:38:53.523899286 +0000 UTC m=+1536.565345863" Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.541079 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" podStartSLOduration=3.5410436499999998 podStartE2EDuration="3.54104365s" podCreationTimestamp="2025-12-04 12:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:53.533357325 +0000 UTC m=+1536.574803912" watchObservedRunningTime="2025-12-04 12:38:53.54104365 +0000 UTC m=+1536.582490207" Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.590505 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-9784-account-create-update-vw2jz" podStartSLOduration=3.590471599 podStartE2EDuration="3.590471599s" podCreationTimestamp="2025-12-04 12:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:38:53.554722764 +0000 UTC m=+1536.596169321" watchObservedRunningTime="2025-12-04 12:38:53.590471599 +0000 UTC m=+1536.631918176" Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.624029 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 12:38:53 crc kubenswrapper[4760]: I1204 12:38:53.915584 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.468462 4760 generic.go:334] "Generic (PLEG): container finished" podID="87816d08-ea62-428d-a43e-40b3e030afb5" containerID="63486cd05984cc68b3dc2624522f8411eb759180ab672a45d3d30cd335427950" exitCode=0 Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.468579 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c68-account-create-update-96z8z" event={"ID":"87816d08-ea62-428d-a43e-40b3e030afb5","Type":"ContainerDied","Data":"63486cd05984cc68b3dc2624522f8411eb759180ab672a45d3d30cd335427950"} Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.471606 4760 generic.go:334] "Generic (PLEG): container finished" podID="b417eac6-2ecb-42b0-a9ad-23860eaefde3" containerID="51d4a3aaa35392c3a3ba9d4e0a4396713a8d4da9ffefd319c448ad7cfea5c85f" exitCode=0 Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.471708 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" event={"ID":"b417eac6-2ecb-42b0-a9ad-23860eaefde3","Type":"ContainerDied","Data":"51d4a3aaa35392c3a3ba9d4e0a4396713a8d4da9ffefd319c448ad7cfea5c85f"} Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.474082 4760 generic.go:334] "Generic (PLEG): container finished" podID="36730972-a8b8-4dd7-8335-805b4b694e42" containerID="b62f6d2b0afedecc875b3ba8b0f7a8c578b971238d9e7cbeb5f09a19e8f27452" exitCode=0 Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.474173 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7vhlm" event={"ID":"36730972-a8b8-4dd7-8335-805b4b694e42","Type":"ContainerDied","Data":"b62f6d2b0afedecc875b3ba8b0f7a8c578b971238d9e7cbeb5f09a19e8f27452"} Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.479695 4760 generic.go:334] "Generic (PLEG): container finished" podID="3b4fdc4d-3353-4e41-8218-282fef7f1418" containerID="eaf187efe64777a275756f0b4323aef50b158ed5d1768e9779421cbbe1d68feb" exitCode=0 Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.480000 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5f284" event={"ID":"3b4fdc4d-3353-4e41-8218-282fef7f1418","Type":"ContainerDied","Data":"eaf187efe64777a275756f0b4323aef50b158ed5d1768e9779421cbbe1d68feb"} Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.495421 4760 generic.go:334] "Generic (PLEG): container finished" podID="9053d7ed-6d5e-44fe-ac2d-17ee4719a590" containerID="94a423a2ffd13e04dbd4b909957b6c76b3ad60e18a51e38a6d7bfcbb18e243a9" exitCode=0 Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.495568 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nszzz" event={"ID":"9053d7ed-6d5e-44fe-ac2d-17ee4719a590","Type":"ContainerDied","Data":"94a423a2ffd13e04dbd4b909957b6c76b3ad60e18a51e38a6d7bfcbb18e243a9"} Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.515130 4760 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.151:9292/healthcheck\": dial tcp 10.217.0.151:9292: connect: connection refused" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.515310 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9292/healthcheck\": dial tcp 10.217.0.151:9292: connect: connection refused" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.518445 4760 generic.go:334] "Generic (PLEG): container finished" podID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerID="0f1949c984123609dea6fc2dc1352f99007579d0af8ff58ef99997830f6e72c7" exitCode=0 Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.518545 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9062b0c5-1053-4a92-a4d1-0830846e5e7f","Type":"ContainerDied","Data":"0f1949c984123609dea6fc2dc1352f99007579d0af8ff58ef99997830f6e72c7"} Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.525657 4760 generic.go:334] "Generic (PLEG): container finished" podID="9e0b6f32-b982-4267-a0b8-b977b91f187c" containerID="10ecf2515015ca9ef4e4759055a22fa3131ff6f83c7b98b7545f64f888176d88" exitCode=0 Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.525948 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9784-account-create-update-vw2jz" event={"ID":"9e0b6f32-b982-4267-a0b8-b977b91f187c","Type":"ContainerDied","Data":"10ecf2515015ca9ef4e4759055a22fa3131ff6f83c7b98b7545f64f888176d88"} Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.526288 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" 
podUID="d4cd7036-99ee-48dc-8df1-63c34f54087b" containerName="manila-share" containerID="cri-o://8e52f36f78c1cf2755bdaa578909f50d145f6968d91b7e131117e2b9c9273634" gracePeriod=30 Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.526587 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="d4cd7036-99ee-48dc-8df1-63c34f54087b" containerName="probe" containerID="cri-o://d09b3d7691c24a637c42570a3d8c74239e3039c6a5e79e374abc2403489c3891" gracePeriod=30 Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.777516 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.866246 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brgcq\" (UniqueName: \"kubernetes.io/projected/9062b0c5-1053-4a92-a4d1-0830846e5e7f-kube-api-access-brgcq\") pod \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.866304 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-combined-ca-bundle\") pod \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.866462 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-config-data\") pod \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.866557 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-scripts\") pod \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.866603 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-run-httpd\") pod \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.866728 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-sg-core-conf-yaml\") pod \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.866823 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-log-httpd\") pod \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\" (UID: \"9062b0c5-1053-4a92-a4d1-0830846e5e7f\") " Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.867992 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9062b0c5-1053-4a92-a4d1-0830846e5e7f" (UID: "9062b0c5-1053-4a92-a4d1-0830846e5e7f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.868346 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9062b0c5-1053-4a92-a4d1-0830846e5e7f" (UID: "9062b0c5-1053-4a92-a4d1-0830846e5e7f"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.874448 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9062b0c5-1053-4a92-a4d1-0830846e5e7f-kube-api-access-brgcq" (OuterVolumeSpecName: "kube-api-access-brgcq") pod "9062b0c5-1053-4a92-a4d1-0830846e5e7f" (UID: "9062b0c5-1053-4a92-a4d1-0830846e5e7f"). InnerVolumeSpecName "kube-api-access-brgcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.875014 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-scripts" (OuterVolumeSpecName: "scripts") pod "9062b0c5-1053-4a92-a4d1-0830846e5e7f" (UID: "9062b0c5-1053-4a92-a4d1-0830846e5e7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.910291 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9062b0c5-1053-4a92-a4d1-0830846e5e7f" (UID: "9062b0c5-1053-4a92-a4d1-0830846e5e7f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.970720 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.971053 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.971068 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.971083 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9062b0c5-1053-4a92-a4d1-0830846e5e7f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.971094 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brgcq\" (UniqueName: \"kubernetes.io/projected/9062b0c5-1053-4a92-a4d1-0830846e5e7f-kube-api-access-brgcq\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:54 crc kubenswrapper[4760]: I1204 12:38:54.978978 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9062b0c5-1053-4a92-a4d1-0830846e5e7f" (UID: "9062b0c5-1053-4a92-a4d1-0830846e5e7f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.025469 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-config-data" (OuterVolumeSpecName: "config-data") pod "9062b0c5-1053-4a92-a4d1-0830846e5e7f" (UID: "9062b0c5-1053-4a92-a4d1-0830846e5e7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.073145 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.073630 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9062b0c5-1053-4a92-a4d1-0830846e5e7f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.548192 4760 generic.go:334] "Generic (PLEG): container finished" podID="d4cd7036-99ee-48dc-8df1-63c34f54087b" containerID="d09b3d7691c24a637c42570a3d8c74239e3039c6a5e79e374abc2403489c3891" exitCode=0 Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.548346 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d4cd7036-99ee-48dc-8df1-63c34f54087b","Type":"ContainerDied","Data":"d09b3d7691c24a637c42570a3d8c74239e3039c6a5e79e374abc2403489c3891"} Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.572833 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9062b0c5-1053-4a92-a4d1-0830846e5e7f","Type":"ContainerDied","Data":"bbe1475238ba85f9fce17c103211194f1e48d76b96dcdee34f6324c6e2e01810"} Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.573198 4760 scope.go:117] "RemoveContainer" 
containerID="bab21263166473b2fa9474ebceda1ed49d6836b279e59332f8129dda0c61b6a1" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.573583 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.588374 4760 generic.go:334] "Generic (PLEG): container finished" podID="ead32194-7c87-4c05-99b6-55a928499e0d" containerID="b4890983a20fcaf634982b513116b99dbc34aa74525b3efaf48e5a5d4349a66b" exitCode=0 Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.588974 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ead32194-7c87-4c05-99b6-55a928499e0d","Type":"ContainerDied","Data":"b4890983a20fcaf634982b513116b99dbc34aa74525b3efaf48e5a5d4349a66b"} Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.616366 4760 scope.go:117] "RemoveContainer" containerID="b7e276e77618243cc3d581a472a7fe921ea0c8c0701214c2c24ce96d9d6c4dd9" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.651554 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.675311 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.695319 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:55 crc kubenswrapper[4760]: E1204 12:38:55.696157 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="proxy-httpd" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.696190 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="proxy-httpd" Dec 04 12:38:55 crc kubenswrapper[4760]: E1204 12:38:55.696270 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="sg-core" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.696279 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="sg-core" Dec 04 12:38:55 crc kubenswrapper[4760]: E1204 12:38:55.696292 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="ceilometer-notification-agent" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.696301 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="ceilometer-notification-agent" Dec 04 12:38:55 crc kubenswrapper[4760]: E1204 12:38:55.696339 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="ceilometer-central-agent" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.696350 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="ceilometer-central-agent" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.696606 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="ceilometer-notification-agent" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.696635 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="ceilometer-central-agent" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.696655 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="proxy-httpd" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.696668 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" containerName="sg-core" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.705256 4760 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.710183 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.732656 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.733273 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.792891 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-log-httpd\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.792995 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-scripts\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.793048 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5q25\" (UniqueName: \"kubernetes.io/projected/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-kube-api-access-c5q25\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.793068 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.793167 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-run-httpd\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.793253 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-config-data\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.793277 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.848398 4760 scope.go:117] "RemoveContainer" containerID="7e9f6e8dcedf5ef24ca9231cd09e9bb54318c296e1ab2b58b8ce7560c92d5cd9" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.885634 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.895068 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-config-data\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.895360 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.895529 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-log-httpd\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.895642 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-scripts\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.895740 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5q25\" (UniqueName: \"kubernetes.io/projected/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-kube-api-access-c5q25\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.895824 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.895959 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-run-httpd\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.896740 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-run-httpd\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.896897 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-log-httpd\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.920523 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.924890 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-scripts\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.931813 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5q25\" (UniqueName: \"kubernetes.io/projected/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-kube-api-access-c5q25\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.932002 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9062b0c5-1053-4a92-a4d1-0830846e5e7f" path="/var/lib/kubelet/pods/9062b0c5-1053-4a92-a4d1-0830846e5e7f/volumes" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.922047 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-config-data\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.937815 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " pod="openstack/ceilometer-0" Dec 04 12:38:55 crc kubenswrapper[4760]: I1204 12:38:55.960602 4760 scope.go:117] "RemoveContainer" containerID="0f1949c984123609dea6fc2dc1352f99007579d0af8ff58ef99997830f6e72c7" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.002939 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-public-tls-certs\") pod \"ead32194-7c87-4c05-99b6-55a928499e0d\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.004197 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9r6x\" (UniqueName: 
\"kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-kube-api-access-h9r6x\") pod \"ead32194-7c87-4c05-99b6-55a928499e0d\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.004312 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-logs\") pod \"ead32194-7c87-4c05-99b6-55a928499e0d\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.004343 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-config-data\") pod \"ead32194-7c87-4c05-99b6-55a928499e0d\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.004419 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-httpd-run\") pod \"ead32194-7c87-4c05-99b6-55a928499e0d\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.004519 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-scripts\") pod \"ead32194-7c87-4c05-99b6-55a928499e0d\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.004586 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-combined-ca-bundle\") pod \"ead32194-7c87-4c05-99b6-55a928499e0d\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.004716 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ead32194-7c87-4c05-99b6-55a928499e0d\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.004780 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-ceph\") pod \"ead32194-7c87-4c05-99b6-55a928499e0d\" (UID: \"ead32194-7c87-4c05-99b6-55a928499e0d\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.007650 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ead32194-7c87-4c05-99b6-55a928499e0d" (UID: "ead32194-7c87-4c05-99b6-55a928499e0d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.009018 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-logs" (OuterVolumeSpecName: "logs") pod "ead32194-7c87-4c05-99b6-55a928499e0d" (UID: "ead32194-7c87-4c05-99b6-55a928499e0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.036856 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-ceph" (OuterVolumeSpecName: "ceph") pod "ead32194-7c87-4c05-99b6-55a928499e0d" (UID: "ead32194-7c87-4c05-99b6-55a928499e0d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.036987 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-scripts" (OuterVolumeSpecName: "scripts") pod "ead32194-7c87-4c05-99b6-55a928499e0d" (UID: "ead32194-7c87-4c05-99b6-55a928499e0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.046576 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "ead32194-7c87-4c05-99b6-55a928499e0d" (UID: "ead32194-7c87-4c05-99b6-55a928499e0d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.047429 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-kube-api-access-h9r6x" (OuterVolumeSpecName: "kube-api-access-h9r6x") pod "ead32194-7c87-4c05-99b6-55a928499e0d" (UID: "ead32194-7c87-4c05-99b6-55a928499e0d"). InnerVolumeSpecName "kube-api-access-h9r6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.073603 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ead32194-7c87-4c05-99b6-55a928499e0d" (UID: "ead32194-7c87-4c05-99b6-55a928499e0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.118806 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.118855 4760 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.118868 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9r6x\" (UniqueName: \"kubernetes.io/projected/ead32194-7c87-4c05-99b6-55a928499e0d-kube-api-access-h9r6x\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.118879 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.118889 4760 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ead32194-7c87-4c05-99b6-55a928499e0d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.118896 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.118905 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.126439 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ead32194-7c87-4c05-99b6-55a928499e0d" (UID: "ead32194-7c87-4c05-99b6-55a928499e0d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.158824 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.164332 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-config-data" (OuterVolumeSpecName: "config-data") pod "ead32194-7c87-4c05-99b6-55a928499e0d" (UID: "ead32194-7c87-4c05-99b6-55a928499e0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.173472 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7vhlm" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.224066 4760 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.224110 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead32194-7c87-4c05-99b6-55a928499e0d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.224125 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.229875 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.326313 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7npv\" (UniqueName: \"kubernetes.io/projected/36730972-a8b8-4dd7-8335-805b4b694e42-kube-api-access-v7npv\") pod \"36730972-a8b8-4dd7-8335-805b4b694e42\" (UID: \"36730972-a8b8-4dd7-8335-805b4b694e42\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.326485 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36730972-a8b8-4dd7-8335-805b4b694e42-operator-scripts\") pod \"36730972-a8b8-4dd7-8335-805b4b694e42\" (UID: \"36730972-a8b8-4dd7-8335-805b4b694e42\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.330326 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36730972-a8b8-4dd7-8335-805b4b694e42-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"36730972-a8b8-4dd7-8335-805b4b694e42" (UID: "36730972-a8b8-4dd7-8335-805b4b694e42"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.339449 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36730972-a8b8-4dd7-8335-805b4b694e42-kube-api-access-v7npv" (OuterVolumeSpecName: "kube-api-access-v7npv") pod "36730972-a8b8-4dd7-8335-805b4b694e42" (UID: "36730972-a8b8-4dd7-8335-805b4b694e42"). InnerVolumeSpecName "kube-api-access-v7npv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.430121 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7npv\" (UniqueName: \"kubernetes.io/projected/36730972-a8b8-4dd7-8335-805b4b694e42-kube-api-access-v7npv\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.430270 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36730972-a8b8-4dd7-8335-805b4b694e42-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.618741 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6c68-account-create-update-96z8z" event={"ID":"87816d08-ea62-428d-a43e-40b3e030afb5","Type":"ContainerDied","Data":"f488217e675d0da102515e7131a14dc853acc257883e76721bd858bf4fa86a10"} Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.618816 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f488217e675d0da102515e7131a14dc853acc257883e76721bd858bf4fa86a10" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.628551 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7vhlm" 
event={"ID":"36730972-a8b8-4dd7-8335-805b4b694e42","Type":"ContainerDied","Data":"acacae4a5308282cdd4f2aef1a8452ccc966c1aba1c3517e10245611dcc9031e"} Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.628615 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acacae4a5308282cdd4f2aef1a8452ccc966c1aba1c3517e10245611dcc9031e" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.628645 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7vhlm" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.633688 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9784-account-create-update-vw2jz" event={"ID":"9e0b6f32-b982-4267-a0b8-b977b91f187c","Type":"ContainerDied","Data":"764543efe2a5a7358f4331feb4e918771d38c2668616f3e0c6766b1415c83cec"} Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.633731 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="764543efe2a5a7358f4331feb4e918771d38c2668616f3e0c6766b1415c83cec" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.644999 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ead32194-7c87-4c05-99b6-55a928499e0d","Type":"ContainerDied","Data":"5c15c9d697119aa5f59cd5f28946c8ab15964ac1d312186c94f6f322fbe1c040"} Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.645071 4760 scope.go:117] "RemoveContainer" containerID="b4890983a20fcaf634982b513116b99dbc34aa74525b3efaf48e5a5d4349a66b" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.645090 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.666960 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" event={"ID":"b417eac6-2ecb-42b0-a9ad-23860eaefde3","Type":"ContainerDied","Data":"53d5c36d0b2cb74c5a1081fd0bf0b59e75daea5dde5dc34a53884a07209ea211"} Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.667022 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53d5c36d0b2cb74c5a1081fd0bf0b59e75daea5dde5dc34a53884a07209ea211" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.670294 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nszzz" event={"ID":"9053d7ed-6d5e-44fe-ac2d-17ee4719a590","Type":"ContainerDied","Data":"4d5ea3e1699e52b08714f28294597d170525c75a9bc15252bf3518442c6e2d93"} Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.670364 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d5ea3e1699e52b08714f28294597d170525c75a9bc15252bf3518442c6e2d93" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.677538 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5f284" event={"ID":"3b4fdc4d-3353-4e41-8218-282fef7f1418","Type":"ContainerDied","Data":"8768deff5edf17dfaf3e55bbb57880f36b8c90377c56f7a1c06f9acd075b6589"} Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.677597 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8768deff5edf17dfaf3e55bbb57880f36b8c90377c56f7a1c06f9acd075b6589" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.680820 4760 generic.go:334] "Generic (PLEG): container finished" podID="d4cd7036-99ee-48dc-8df1-63c34f54087b" containerID="8e52f36f78c1cf2755bdaa578909f50d145f6968d91b7e131117e2b9c9273634" exitCode=1 Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.680871 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d4cd7036-99ee-48dc-8df1-63c34f54087b","Type":"ContainerDied","Data":"8e52f36f78c1cf2755bdaa578909f50d145f6968d91b7e131117e2b9c9273634"} Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.732893 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nszzz" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.735003 4760 scope.go:117] "RemoveContainer" containerID="687679c5e5a0e7a203ada44ea9f444017276f4c4fbac1d624bf04a251f7494e0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.738039 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5f284" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.761941 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9784-account-create-update-vw2jz" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.769473 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.788830 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c68-account-create-update-96z8z" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.804872 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.818339 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.851613 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28ws6\" (UniqueName: \"kubernetes.io/projected/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-kube-api-access-28ws6\") pod \"9053d7ed-6d5e-44fe-ac2d-17ee4719a590\" (UID: \"9053d7ed-6d5e-44fe-ac2d-17ee4719a590\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.851815 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87816d08-ea62-428d-a43e-40b3e030afb5-operator-scripts\") pod \"87816d08-ea62-428d-a43e-40b3e030afb5\" (UID: \"87816d08-ea62-428d-a43e-40b3e030afb5\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.852149 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e0b6f32-b982-4267-a0b8-b977b91f187c-operator-scripts\") pod \"9e0b6f32-b982-4267-a0b8-b977b91f187c\" (UID: \"9e0b6f32-b982-4267-a0b8-b977b91f187c\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.852194 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zfp9\" (UniqueName: \"kubernetes.io/projected/87816d08-ea62-428d-a43e-40b3e030afb5-kube-api-access-6zfp9\") pod \"87816d08-ea62-428d-a43e-40b3e030afb5\" (UID: \"87816d08-ea62-428d-a43e-40b3e030afb5\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.852262 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-operator-scripts\") pod 
\"9053d7ed-6d5e-44fe-ac2d-17ee4719a590\" (UID: \"9053d7ed-6d5e-44fe-ac2d-17ee4719a590\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.852296 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nfqc\" (UniqueName: \"kubernetes.io/projected/3b4fdc4d-3353-4e41-8218-282fef7f1418-kube-api-access-7nfqc\") pod \"3b4fdc4d-3353-4e41-8218-282fef7f1418\" (UID: \"3b4fdc4d-3353-4e41-8218-282fef7f1418\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.852385 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b4fdc4d-3353-4e41-8218-282fef7f1418-operator-scripts\") pod \"3b4fdc4d-3353-4e41-8218-282fef7f1418\" (UID: \"3b4fdc4d-3353-4e41-8218-282fef7f1418\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.852439 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6gzf\" (UniqueName: \"kubernetes.io/projected/9e0b6f32-b982-4267-a0b8-b977b91f187c-kube-api-access-d6gzf\") pod \"9e0b6f32-b982-4267-a0b8-b977b91f187c\" (UID: \"9e0b6f32-b982-4267-a0b8-b977b91f187c\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.853253 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e0b6f32-b982-4267-a0b8-b977b91f187c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e0b6f32-b982-4267-a0b8-b977b91f187c" (UID: "9e0b6f32-b982-4267-a0b8-b977b91f187c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.853286 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9053d7ed-6d5e-44fe-ac2d-17ee4719a590" (UID: "9053d7ed-6d5e-44fe-ac2d-17ee4719a590"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.861552 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87816d08-ea62-428d-a43e-40b3e030afb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87816d08-ea62-428d-a43e-40b3e030afb5" (UID: "87816d08-ea62-428d-a43e-40b3e030afb5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.870985 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4fdc4d-3353-4e41-8218-282fef7f1418-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b4fdc4d-3353-4e41-8218-282fef7f1418" (UID: "3b4fdc4d-3353-4e41-8218-282fef7f1418"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.874615 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:38:56 crc kubenswrapper[4760]: E1204 12:38:56.875109 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36730972-a8b8-4dd7-8335-805b4b694e42" containerName="mariadb-database-create" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875124 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="36730972-a8b8-4dd7-8335-805b4b694e42" containerName="mariadb-database-create" Dec 04 12:38:56 crc kubenswrapper[4760]: E1204 12:38:56.875152 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" containerName="glance-httpd" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875160 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" containerName="glance-httpd" Dec 04 12:38:56 crc kubenswrapper[4760]: 
E1204 12:38:56.875178 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b417eac6-2ecb-42b0-a9ad-23860eaefde3" containerName="mariadb-account-create-update" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875187 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b417eac6-2ecb-42b0-a9ad-23860eaefde3" containerName="mariadb-account-create-update" Dec 04 12:38:56 crc kubenswrapper[4760]: E1204 12:38:56.875225 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e0b6f32-b982-4267-a0b8-b977b91f187c" containerName="mariadb-account-create-update" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875233 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e0b6f32-b982-4267-a0b8-b977b91f187c" containerName="mariadb-account-create-update" Dec 04 12:38:56 crc kubenswrapper[4760]: E1204 12:38:56.875248 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9053d7ed-6d5e-44fe-ac2d-17ee4719a590" containerName="mariadb-database-create" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875256 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9053d7ed-6d5e-44fe-ac2d-17ee4719a590" containerName="mariadb-database-create" Dec 04 12:38:56 crc kubenswrapper[4760]: E1204 12:38:56.875270 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87816d08-ea62-428d-a43e-40b3e030afb5" containerName="mariadb-account-create-update" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875277 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="87816d08-ea62-428d-a43e-40b3e030afb5" containerName="mariadb-account-create-update" Dec 04 12:38:56 crc kubenswrapper[4760]: E1204 12:38:56.875305 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4fdc4d-3353-4e41-8218-282fef7f1418" containerName="mariadb-database-create" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875314 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3b4fdc4d-3353-4e41-8218-282fef7f1418" containerName="mariadb-database-create" Dec 04 12:38:56 crc kubenswrapper[4760]: E1204 12:38:56.875324 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" containerName="glance-log" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875332 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" containerName="glance-log" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875557 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4fdc4d-3353-4e41-8218-282fef7f1418" containerName="mariadb-database-create" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875571 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="87816d08-ea62-428d-a43e-40b3e030afb5" containerName="mariadb-account-create-update" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875587 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="36730972-a8b8-4dd7-8335-805b4b694e42" containerName="mariadb-database-create" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875604 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" containerName="glance-httpd" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875613 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b417eac6-2ecb-42b0-a9ad-23860eaefde3" containerName="mariadb-account-create-update" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875625 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9053d7ed-6d5e-44fe-ac2d-17ee4719a590" containerName="mariadb-database-create" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.875634 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" containerName="glance-log" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 
12:38:56.875642 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e0b6f32-b982-4267-a0b8-b977b91f187c" containerName="mariadb-account-create-update" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.877974 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.878831 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87816d08-ea62-428d-a43e-40b3e030afb5-kube-api-access-6zfp9" (OuterVolumeSpecName: "kube-api-access-6zfp9") pod "87816d08-ea62-428d-a43e-40b3e030afb5" (UID: "87816d08-ea62-428d-a43e-40b3e030afb5"). InnerVolumeSpecName "kube-api-access-6zfp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.879583 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-kube-api-access-28ws6" (OuterVolumeSpecName: "kube-api-access-28ws6") pod "9053d7ed-6d5e-44fe-ac2d-17ee4719a590" (UID: "9053d7ed-6d5e-44fe-ac2d-17ee4719a590"). InnerVolumeSpecName "kube-api-access-28ws6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.881010 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e0b6f32-b982-4267-a0b8-b977b91f187c-kube-api-access-d6gzf" (OuterVolumeSpecName: "kube-api-access-d6gzf") pod "9e0b6f32-b982-4267-a0b8-b977b91f187c" (UID: "9e0b6f32-b982-4267-a0b8-b977b91f187c"). InnerVolumeSpecName "kube-api-access-d6gzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.884564 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.885071 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.885896 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4fdc4d-3353-4e41-8218-282fef7f1418-kube-api-access-7nfqc" (OuterVolumeSpecName: "kube-api-access-7nfqc") pod "3b4fdc4d-3353-4e41-8218-282fef7f1418" (UID: "3b4fdc4d-3353-4e41-8218-282fef7f1418"). InnerVolumeSpecName "kube-api-access-7nfqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.896809 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.960316 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b417eac6-2ecb-42b0-a9ad-23860eaefde3-operator-scripts\") pod \"b417eac6-2ecb-42b0-a9ad-23860eaefde3\" (UID: \"b417eac6-2ecb-42b0-a9ad-23860eaefde3\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.960741 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jpct\" (UniqueName: \"kubernetes.io/projected/b417eac6-2ecb-42b0-a9ad-23860eaefde3-kube-api-access-7jpct\") pod \"b417eac6-2ecb-42b0-a9ad-23860eaefde3\" (UID: \"b417eac6-2ecb-42b0-a9ad-23860eaefde3\") " Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961279 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-config-data\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961374 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa001053-23fb-4a80-8f36-8efc97cdc04d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961408 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961458 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961529 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa001053-23fb-4a80-8f36-8efc97cdc04d-logs\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961591 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/fa001053-23fb-4a80-8f36-8efc97cdc04d-ceph\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961674 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-scripts\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961709 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84zxj\" (UniqueName: \"kubernetes.io/projected/fa001053-23fb-4a80-8f36-8efc97cdc04d-kube-api-access-84zxj\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961839 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961950 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b4fdc4d-3353-4e41-8218-282fef7f1418-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961969 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6gzf\" (UniqueName: \"kubernetes.io/projected/9e0b6f32-b982-4267-a0b8-b977b91f187c-kube-api-access-d6gzf\") on node \"crc\" DevicePath \"\"" Dec 04 
12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961986 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28ws6\" (UniqueName: \"kubernetes.io/projected/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-kube-api-access-28ws6\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.961981 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b417eac6-2ecb-42b0-a9ad-23860eaefde3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b417eac6-2ecb-42b0-a9ad-23860eaefde3" (UID: "b417eac6-2ecb-42b0-a9ad-23860eaefde3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.962000 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87816d08-ea62-428d-a43e-40b3e030afb5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.962081 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e0b6f32-b982-4267-a0b8-b977b91f187c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.962099 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zfp9\" (UniqueName: \"kubernetes.io/projected/87816d08-ea62-428d-a43e-40b3e030afb5-kube-api-access-6zfp9\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.962115 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9053d7ed-6d5e-44fe-ac2d-17ee4719a590-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.962130 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nfqc\" (UniqueName: 
\"kubernetes.io/projected/3b4fdc4d-3353-4e41-8218-282fef7f1418-kube-api-access-7nfqc\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:56 crc kubenswrapper[4760]: I1204 12:38:56.969302 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b417eac6-2ecb-42b0-a9ad-23860eaefde3-kube-api-access-7jpct" (OuterVolumeSpecName: "kube-api-access-7jpct") pod "b417eac6-2ecb-42b0-a9ad-23860eaefde3" (UID: "b417eac6-2ecb-42b0-a9ad-23860eaefde3"). InnerVolumeSpecName "kube-api-access-7jpct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.028617 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.063941 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.064027 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-config-data\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.064091 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa001053-23fb-4a80-8f36-8efc97cdc04d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.064118 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.064155 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.064235 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa001053-23fb-4a80-8f36-8efc97cdc04d-logs\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.064294 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fa001053-23fb-4a80-8f36-8efc97cdc04d-ceph\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.064398 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-scripts\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.064434 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84zxj\" (UniqueName: 
\"kubernetes.io/projected/fa001053-23fb-4a80-8f36-8efc97cdc04d-kube-api-access-84zxj\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.064520 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jpct\" (UniqueName: \"kubernetes.io/projected/b417eac6-2ecb-42b0-a9ad-23860eaefde3-kube-api-access-7jpct\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.064536 4760 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b417eac6-2ecb-42b0-a9ad-23860eaefde3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.065306 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa001053-23fb-4a80-8f36-8efc97cdc04d-logs\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.067581 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa001053-23fb-4a80-8f36-8efc97cdc04d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.068647 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.075936 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-scripts\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.076077 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.079748 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fa001053-23fb-4a80-8f36-8efc97cdc04d-ceph\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.080519 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.087322 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa001053-23fb-4a80-8f36-8efc97cdc04d-config-data\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.100578 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84zxj\" (UniqueName: 
\"kubernetes.io/projected/fa001053-23fb-4a80-8f36-8efc97cdc04d-kube-api-access-84zxj\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.135982 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"fa001053-23fb-4a80-8f36-8efc97cdc04d\") " pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.165662 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-combined-ca-bundle\") pod \"d4cd7036-99ee-48dc-8df1-63c34f54087b\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.165737 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-etc-machine-id\") pod \"d4cd7036-99ee-48dc-8df1-63c34f54087b\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.165773 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data\") pod \"d4cd7036-99ee-48dc-8df1-63c34f54087b\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.165913 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-var-lib-manila\") pod \"d4cd7036-99ee-48dc-8df1-63c34f54087b\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " Dec 04 
12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.165890 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d4cd7036-99ee-48dc-8df1-63c34f54087b" (UID: "d4cd7036-99ee-48dc-8df1-63c34f54087b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.165968 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrfxk\" (UniqueName: \"kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-kube-api-access-rrfxk\") pod \"d4cd7036-99ee-48dc-8df1-63c34f54087b\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.165980 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "d4cd7036-99ee-48dc-8df1-63c34f54087b" (UID: "d4cd7036-99ee-48dc-8df1-63c34f54087b"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.166019 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-scripts\") pod \"d4cd7036-99ee-48dc-8df1-63c34f54087b\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.166619 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-ceph\") pod \"d4cd7036-99ee-48dc-8df1-63c34f54087b\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.166703 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data-custom\") pod \"d4cd7036-99ee-48dc-8df1-63c34f54087b\" (UID: \"d4cd7036-99ee-48dc-8df1-63c34f54087b\") " Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.167492 4760 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.168006 4760 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/d4cd7036-99ee-48dc-8df1-63c34f54087b-var-lib-manila\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.171038 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-scripts" (OuterVolumeSpecName: "scripts") pod "d4cd7036-99ee-48dc-8df1-63c34f54087b" (UID: "d4cd7036-99ee-48dc-8df1-63c34f54087b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.172715 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-kube-api-access-rrfxk" (OuterVolumeSpecName: "kube-api-access-rrfxk") pod "d4cd7036-99ee-48dc-8df1-63c34f54087b" (UID: "d4cd7036-99ee-48dc-8df1-63c34f54087b"). InnerVolumeSpecName "kube-api-access-rrfxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.172967 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d4cd7036-99ee-48dc-8df1-63c34f54087b" (UID: "d4cd7036-99ee-48dc-8df1-63c34f54087b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.174470 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-ceph" (OuterVolumeSpecName: "ceph") pod "d4cd7036-99ee-48dc-8df1-63c34f54087b" (UID: "d4cd7036-99ee-48dc-8df1-63c34f54087b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.234190 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.260198 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4cd7036-99ee-48dc-8df1-63c34f54087b" (UID: "d4cd7036-99ee-48dc-8df1-63c34f54087b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.270067 4760 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.270105 4760 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.270117 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.270126 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrfxk\" (UniqueName: \"kubernetes.io/projected/d4cd7036-99ee-48dc-8df1-63c34f54087b-kube-api-access-rrfxk\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.270137 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.313959 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.344575 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data" (OuterVolumeSpecName: "config-data") pod "d4cd7036-99ee-48dc-8df1-63c34f54087b" (UID: "d4cd7036-99ee-48dc-8df1-63c34f54087b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.372862 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4cd7036-99ee-48dc-8df1-63c34f54087b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.713758 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"d4cd7036-99ee-48dc-8df1-63c34f54087b","Type":"ContainerDied","Data":"47d31f3c802336c79b29ff4343e965c9619fd737346aac6fd6babeed7986065d"} Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.713798 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.713836 4760 scope.go:117] "RemoveContainer" containerID="d09b3d7691c24a637c42570a3d8c74239e3039c6a5e79e374abc2403489c3891" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.732487 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nszzz" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.739646 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6c68-account-create-update-96z8z" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.742831 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58","Type":"ContainerStarted","Data":"7fee5b77e46bad2d2c68c17adcaa01b900b53a828e049ecc1dccf97b2e021b92"} Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.742958 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9784-account-create-update-vw2jz" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.743340 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9fc7-account-create-update-6ctmc" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.745530 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5f284" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.759515 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.796853 4760 scope.go:117] "RemoveContainer" containerID="8e52f36f78c1cf2755bdaa578909f50d145f6968d91b7e131117e2b9c9273634" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.797398 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.820331 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 12:38:57 crc kubenswrapper[4760]: E1204 12:38:57.826016 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cd7036-99ee-48dc-8df1-63c34f54087b" containerName="probe" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.826066 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cd7036-99ee-48dc-8df1-63c34f54087b" containerName="probe" Dec 04 12:38:57 crc kubenswrapper[4760]: E1204 12:38:57.826117 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cd7036-99ee-48dc-8df1-63c34f54087b" containerName="manila-share" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.826128 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cd7036-99ee-48dc-8df1-63c34f54087b" containerName="manila-share" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.826811 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4cd7036-99ee-48dc-8df1-63c34f54087b" containerName="probe" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.826846 4760 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="d4cd7036-99ee-48dc-8df1-63c34f54087b" containerName="manila-share" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.829145 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.833246 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.836191 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.903120 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.903394 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/546d61e1-ffcb-48a3-8dae-929470ae8372-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.903495 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/546d61e1-ffcb-48a3-8dae-929470ae8372-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.903712 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-config-data\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.903864 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmd5v\" (UniqueName: \"kubernetes.io/projected/546d61e1-ffcb-48a3-8dae-929470ae8372-kube-api-access-gmd5v\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.903927 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.903983 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-scripts\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:57 crc kubenswrapper[4760]: I1204 12:38:57.904021 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/546d61e1-ffcb-48a3-8dae-929470ae8372-ceph\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.034695 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-config-data\") 
pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.042589 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmd5v\" (UniqueName: \"kubernetes.io/projected/546d61e1-ffcb-48a3-8dae-929470ae8372-kube-api-access-gmd5v\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.042953 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.043083 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-scripts\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.043259 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/546d61e1-ffcb-48a3-8dae-929470ae8372-ceph\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.043587 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc 
kubenswrapper[4760]: I1204 12:38:58.054815 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/546d61e1-ffcb-48a3-8dae-929470ae8372-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.054955 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/546d61e1-ffcb-48a3-8dae-929470ae8372-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.055712 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/546d61e1-ffcb-48a3-8dae-929470ae8372-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.055782 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/546d61e1-ffcb-48a3-8dae-929470ae8372-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.075444 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-config-data\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.075619 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-scripts\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.079162 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.131260 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/546d61e1-ffcb-48a3-8dae-929470ae8372-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.146968 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/546d61e1-ffcb-48a3-8dae-929470ae8372-ceph\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.170084 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmd5v\" (UniqueName: \"kubernetes.io/projected/546d61e1-ffcb-48a3-8dae-929470ae8372-kube-api-access-gmd5v\") pod \"manila-share-share1-0\" (UID: \"546d61e1-ffcb-48a3-8dae-929470ae8372\") " pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.316686 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4cd7036-99ee-48dc-8df1-63c34f54087b" path="/var/lib/kubelet/pods/d4cd7036-99ee-48dc-8df1-63c34f54087b/volumes" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.317930 4760 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ead32194-7c87-4c05-99b6-55a928499e0d" path="/var/lib/kubelet/pods/ead32194-7c87-4c05-99b6-55a928499e0d/volumes" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.326810 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.326878 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.400319 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 12:38:58 crc kubenswrapper[4760]: I1204 12:38:58.751810 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fa001053-23fb-4a80-8f36-8efc97cdc04d","Type":"ContainerStarted","Data":"8a6cad42f96ba5cb19a0e9d9b3b056002f2ad82fbdfaf588b0933ab34f9d2e3d"} Dec 04 12:38:59 crc kubenswrapper[4760]: I1204 12:38:59.079890 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 12:38:59 crc kubenswrapper[4760]: W1204 12:38:59.087173 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod546d61e1_ffcb_48a3_8dae_929470ae8372.slice/crio-65e7ef89f66aec97f08b2e7353cc5bc19b7b21e025e3469b3defaf127e614fb6 WatchSource:0}: Error finding container 65e7ef89f66aec97f08b2e7353cc5bc19b7b21e025e3469b3defaf127e614fb6: Status 404 returned error can't find the container with id 65e7ef89f66aec97f08b2e7353cc5bc19b7b21e025e3469b3defaf127e614fb6 Dec 04 12:38:59 crc kubenswrapper[4760]: I1204 12:38:59.428317 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:38:59 crc kubenswrapper[4760]: I1204 12:38:59.428936 4760 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/glance-default-internal-api-0" podUID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" containerName="glance-log" containerID="cri-o://397a43da3613c2b526c626e5e6731e03259fc8d31bd1daac41d3f811bb234537" gracePeriod=30 Dec 04 12:38:59 crc kubenswrapper[4760]: I1204 12:38:59.429672 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" containerName="glance-httpd" containerID="cri-o://0633f872cc5171c179e3df911714690342d38a84f061229b5c5387033035ac80" gracePeriod=30 Dec 04 12:38:59 crc kubenswrapper[4760]: I1204 12:38:59.806920 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fa001053-23fb-4a80-8f36-8efc97cdc04d","Type":"ContainerStarted","Data":"8d65b1510915f106b8fcf713f72c06436ee804c0daac84ee68f17ed6a5fd5795"} Dec 04 12:38:59 crc kubenswrapper[4760]: I1204 12:38:59.814002 4760 generic.go:334] "Generic (PLEG): container finished" podID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" containerID="397a43da3613c2b526c626e5e6731e03259fc8d31bd1daac41d3f811bb234537" exitCode=143 Dec 04 12:38:59 crc kubenswrapper[4760]: I1204 12:38:59.814117 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16fa8aaa-4421-48a8-8b79-9d0780e6d04a","Type":"ContainerDied","Data":"397a43da3613c2b526c626e5e6731e03259fc8d31bd1daac41d3f811bb234537"} Dec 04 12:38:59 crc kubenswrapper[4760]: I1204 12:38:59.817159 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"546d61e1-ffcb-48a3-8dae-929470ae8372","Type":"ContainerStarted","Data":"65e7ef89f66aec97f08b2e7353cc5bc19b7b21e025e3469b3defaf127e614fb6"} Dec 04 12:39:00 crc kubenswrapper[4760]: I1204 12:39:00.834156 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"546d61e1-ffcb-48a3-8dae-929470ae8372","Type":"ContainerStarted","Data":"521bb3c91d4ce440dcff66c6d242cd2d497f3ea74c87ea771256c98ea0da7247"} Dec 04 12:39:00 crc kubenswrapper[4760]: I1204 12:39:00.839445 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2d73360c-cea6-4c66-88fc-554bda882906" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.177:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:39:00 crc kubenswrapper[4760]: I1204 12:39:00.843416 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fa001053-23fb-4a80-8f36-8efc97cdc04d","Type":"ContainerStarted","Data":"7869935700a30072fe19a62a8692e743efd52d43201fc42f6b3566a253d6354e"} Dec 04 12:39:00 crc kubenswrapper[4760]: I1204 12:39:00.848932 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58","Type":"ContainerStarted","Data":"e1cd34c714dd9abe34542df33c12b6621bf3d46bf1611d48fc31b83dfccbcb5e"} Dec 04 12:39:00 crc kubenswrapper[4760]: I1204 12:39:00.867388 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.86735259 podStartE2EDuration="4.86735259s" podCreationTimestamp="2025-12-04 12:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:39:00.862253497 +0000 UTC m=+1543.903700084" watchObservedRunningTime="2025-12-04 12:39:00.86735259 +0000 UTC m=+1543.908799157" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.543225 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jpsv5"] Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.549051 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.553776 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bpqcl" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.554057 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.554201 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.566821 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jpsv5"] Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.577211 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpk62\" (UniqueName: \"kubernetes.io/projected/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-kube-api-access-mpk62\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.577324 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.577467 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-config-data\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " 
pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.577537 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-scripts\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.681957 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpk62\" (UniqueName: \"kubernetes.io/projected/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-kube-api-access-mpk62\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.686033 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.687629 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-config-data\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.687804 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-scripts\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " 
pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.696761 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-config-data\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.718136 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpk62\" (UniqueName: \"kubernetes.io/projected/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-kube-api-access-mpk62\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.720066 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.744493 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-scripts\") pod \"nova-cell0-conductor-db-sync-jpsv5\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.880793 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"546d61e1-ffcb-48a3-8dae-929470ae8372","Type":"ContainerStarted","Data":"20b4249396b91598686fd382b2ed48592e68ba6b12a7530a02332119548cafdc"} Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.880868 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58","Type":"ContainerStarted","Data":"574f222c7ce40fd6146c074338e6aececaca90c4276f9fad6197b3b005c0c145"} Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.902501 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.9024694669999995 podStartE2EDuration="4.902469467s" podCreationTimestamp="2025-12-04 12:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:39:01.892137479 +0000 UTC m=+1544.933584036" watchObservedRunningTime="2025-12-04 12:39:01.902469467 +0000 UTC m=+1544.943916034" Dec 04 12:39:01 crc kubenswrapper[4760]: I1204 12:39:01.926734 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:02 crc kubenswrapper[4760]: I1204 12:39:02.493258 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jpsv5"] Dec 04 12:39:02 crc kubenswrapper[4760]: I1204 12:39:02.909212 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jpsv5" event={"ID":"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56","Type":"ContainerStarted","Data":"27b0a49e4f7b06d40141ca82bbc4beee0e5790157b936c6d1d9bf7138af63959"} Dec 04 12:39:03 crc kubenswrapper[4760]: I1204 12:39:03.079942 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:03 crc kubenswrapper[4760]: I1204 12:39:03.917968 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: 
connection refused" Dec 04 12:39:03 crc kubenswrapper[4760]: I1204 12:39:03.936509 4760 generic.go:334] "Generic (PLEG): container finished" podID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" containerID="0633f872cc5171c179e3df911714690342d38a84f061229b5c5387033035ac80" exitCode=0 Dec 04 12:39:03 crc kubenswrapper[4760]: I1204 12:39:03.936712 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16fa8aaa-4421-48a8-8b79-9d0780e6d04a","Type":"ContainerDied","Data":"0633f872cc5171c179e3df911714690342d38a84f061229b5c5387033035ac80"} Dec 04 12:39:03 crc kubenswrapper[4760]: I1204 12:39:03.936950 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16fa8aaa-4421-48a8-8b79-9d0780e6d04a","Type":"ContainerDied","Data":"bb7359e8379504ff26120a6ee0aa3ece38739f873b67a9bd16a3b9f3176c8ac8"} Dec 04 12:39:03 crc kubenswrapper[4760]: I1204 12:39:03.936974 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb7359e8379504ff26120a6ee0aa3ece38739f873b67a9bd16a3b9f3176c8ac8" Dec 04 12:39:03 crc kubenswrapper[4760]: I1204 12:39:03.950472 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58","Type":"ContainerStarted","Data":"c0cef0ba5d9b5eaa51b35f44405dfe410116ffd95fb011638e0921894b2010c1"} Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.007115 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.046809 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2frdn\" (UniqueName: \"kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-kube-api-access-2frdn\") pod \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.046967 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-httpd-run\") pod \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.047072 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-logs\") pod \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.047656 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16fa8aaa-4421-48a8-8b79-9d0780e6d04a" (UID: "16fa8aaa-4421-48a8-8b79-9d0780e6d04a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.047783 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-logs" (OuterVolumeSpecName: "logs") pod "16fa8aaa-4421-48a8-8b79-9d0780e6d04a" (UID: "16fa8aaa-4421-48a8-8b79-9d0780e6d04a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.047938 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-config-data\") pod \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.048413 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-combined-ca-bundle\") pod \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.048449 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-internal-tls-certs\") pod \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.048497 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.048526 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-scripts\") pod \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.048564 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-ceph\") pod \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\" (UID: \"16fa8aaa-4421-48a8-8b79-9d0780e6d04a\") " Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.049174 4760 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.049194 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.060078 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "16fa8aaa-4421-48a8-8b79-9d0780e6d04a" (UID: "16fa8aaa-4421-48a8-8b79-9d0780e6d04a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.066437 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-scripts" (OuterVolumeSpecName: "scripts") pod "16fa8aaa-4421-48a8-8b79-9d0780e6d04a" (UID: "16fa8aaa-4421-48a8-8b79-9d0780e6d04a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.080583 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-ceph" (OuterVolumeSpecName: "ceph") pod "16fa8aaa-4421-48a8-8b79-9d0780e6d04a" (UID: "16fa8aaa-4421-48a8-8b79-9d0780e6d04a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.117128 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-kube-api-access-2frdn" (OuterVolumeSpecName: "kube-api-access-2frdn") pod "16fa8aaa-4421-48a8-8b79-9d0780e6d04a" (UID: "16fa8aaa-4421-48a8-8b79-9d0780e6d04a"). InnerVolumeSpecName "kube-api-access-2frdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.118542 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16fa8aaa-4421-48a8-8b79-9d0780e6d04a" (UID: "16fa8aaa-4421-48a8-8b79-9d0780e6d04a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.151591 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.151640 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.151650 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.151659 4760 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 
12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.151672 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2frdn\" (UniqueName: \"kubernetes.io/projected/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-kube-api-access-2frdn\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.179483 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "16fa8aaa-4421-48a8-8b79-9d0780e6d04a" (UID: "16fa8aaa-4421-48a8-8b79-9d0780e6d04a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.195623 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.265176 4760 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.265294 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.293534 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-config-data" (OuterVolumeSpecName: "config-data") pod "16fa8aaa-4421-48a8-8b79-9d0780e6d04a" (UID: "16fa8aaa-4421-48a8-8b79-9d0780e6d04a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.366763 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16fa8aaa-4421-48a8-8b79-9d0780e6d04a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.967750 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.968024 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="ceilometer-notification-agent" containerID="cri-o://574f222c7ce40fd6146c074338e6aececaca90c4276f9fad6197b3b005c0c145" gracePeriod=30 Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.968060 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="proxy-httpd" containerID="cri-o://b6cbb9b6d772d206cb0fe0e21723b0005b6a75386f66cd367702deda6204f031" gracePeriod=30 Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.967762 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58","Type":"ContainerStarted","Data":"b6cbb9b6d772d206cb0fe0e21723b0005b6a75386f66cd367702deda6204f031"} Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.968178 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="sg-core" containerID="cri-o://c0cef0ba5d9b5eaa51b35f44405dfe410116ffd95fb011638e0921894b2010c1" gracePeriod=30 Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.968008 4760 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="ceilometer-central-agent" containerID="cri-o://e1cd34c714dd9abe34542df33c12b6621bf3d46bf1611d48fc31b83dfccbcb5e" gracePeriod=30 Dec 04 12:39:04 crc kubenswrapper[4760]: I1204 12:39:04.968549 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.014430 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.115782255 podStartE2EDuration="10.014397389s" podCreationTimestamp="2025-12-04 12:38:55 +0000 UTC" firstStartedPulling="2025-12-04 12:38:57.237198076 +0000 UTC m=+1540.278644643" lastFinishedPulling="2025-12-04 12:39:04.13581321 +0000 UTC m=+1547.177259777" observedRunningTime="2025-12-04 12:39:05.002911614 +0000 UTC m=+1548.044358181" watchObservedRunningTime="2025-12-04 12:39:05.014397389 +0000 UTC m=+1548.055843956" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.046309 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.062241 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.083051 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:39:05 crc kubenswrapper[4760]: E1204 12:39:05.083834 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" containerName="glance-log" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.083855 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" containerName="glance-log" Dec 04 12:39:05 crc kubenswrapper[4760]: E1204 12:39:05.083898 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" containerName="glance-httpd" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.083908 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" containerName="glance-httpd" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.084193 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" containerName="glance-log" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.084238 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" containerName="glance-httpd" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.086034 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.091500 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.102570 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.119854 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.193372 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.193459 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgz47\" (UniqueName: 
\"kubernetes.io/projected/034481e8-9000-4642-8f09-01e015db2de2-kube-api-access-hgz47\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.193519 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.193632 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/034481e8-9000-4642-8f09-01e015db2de2-ceph\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.193847 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/034481e8-9000-4642-8f09-01e015db2de2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.194271 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.194380 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.194505 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/034481e8-9000-4642-8f09-01e015db2de2-logs\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.194551 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.297088 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.297175 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/034481e8-9000-4642-8f09-01e015db2de2-ceph\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.297285 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/034481e8-9000-4642-8f09-01e015db2de2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.297426 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.297468 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.297524 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/034481e8-9000-4642-8f09-01e015db2de2-logs\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.297545 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.297607 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.297638 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgz47\" (UniqueName: \"kubernetes.io/projected/034481e8-9000-4642-8f09-01e015db2de2-kube-api-access-hgz47\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.298095 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/034481e8-9000-4642-8f09-01e015db2de2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.298386 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.299396 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/034481e8-9000-4642-8f09-01e015db2de2-logs\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.305948 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/034481e8-9000-4642-8f09-01e015db2de2-ceph\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.306118 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.317354 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.318805 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.324540 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgz47\" (UniqueName: \"kubernetes.io/projected/034481e8-9000-4642-8f09-01e015db2de2-kube-api-access-hgz47\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.325177 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/034481e8-9000-4642-8f09-01e015db2de2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: 
I1204 12:39:05.374914 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"034481e8-9000-4642-8f09-01e015db2de2\") " pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.475358 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:05 crc kubenswrapper[4760]: I1204 12:39:05.884997 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fa8aaa-4421-48a8-8b79-9d0780e6d04a" path="/var/lib/kubelet/pods/16fa8aaa-4421-48a8-8b79-9d0780e6d04a/volumes" Dec 04 12:39:06 crc kubenswrapper[4760]: I1204 12:39:06.026437 4760 generic.go:334] "Generic (PLEG): container finished" podID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerID="b6cbb9b6d772d206cb0fe0e21723b0005b6a75386f66cd367702deda6204f031" exitCode=0 Dec 04 12:39:06 crc kubenswrapper[4760]: I1204 12:39:06.027576 4760 generic.go:334] "Generic (PLEG): container finished" podID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerID="c0cef0ba5d9b5eaa51b35f44405dfe410116ffd95fb011638e0921894b2010c1" exitCode=2 Dec 04 12:39:06 crc kubenswrapper[4760]: I1204 12:39:06.027532 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58","Type":"ContainerDied","Data":"b6cbb9b6d772d206cb0fe0e21723b0005b6a75386f66cd367702deda6204f031"} Dec 04 12:39:06 crc kubenswrapper[4760]: I1204 12:39:06.027775 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58","Type":"ContainerDied","Data":"c0cef0ba5d9b5eaa51b35f44405dfe410116ffd95fb011638e0921894b2010c1"} Dec 04 12:39:06 crc kubenswrapper[4760]: I1204 12:39:06.381624 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 04 12:39:07 crc kubenswrapper[4760]: I1204 12:39:07.072698 4760 generic.go:334] "Generic (PLEG): container finished" podID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerID="574f222c7ce40fd6146c074338e6aececaca90c4276f9fad6197b3b005c0c145" exitCode=0 Dec 04 12:39:07 crc kubenswrapper[4760]: I1204 12:39:07.072817 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58","Type":"ContainerDied","Data":"574f222c7ce40fd6146c074338e6aececaca90c4276f9fad6197b3b005c0c145"} Dec 04 12:39:07 crc kubenswrapper[4760]: I1204 12:39:07.076594 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"034481e8-9000-4642-8f09-01e015db2de2","Type":"ContainerStarted","Data":"5631d4babb9e203d15589ddcbcde01942999bfd71b7a8451e27972970b7d94a3"} Dec 04 12:39:07 crc kubenswrapper[4760]: I1204 12:39:07.315783 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 12:39:07 crc kubenswrapper[4760]: I1204 12:39:07.316195 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 12:39:07 crc kubenswrapper[4760]: I1204 12:39:07.443012 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 12:39:07 crc kubenswrapper[4760]: I1204 12:39:07.460090 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 12:39:08 crc kubenswrapper[4760]: I1204 12:39:08.100765 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"034481e8-9000-4642-8f09-01e015db2de2","Type":"ContainerStarted","Data":"84fd2d3d246cb47360f9fa331309cc7d5c01588f0d7dd88dfc1bb359bc040573"} Dec 04 12:39:08 crc 
kubenswrapper[4760]: I1204 12:39:08.101470 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"034481e8-9000-4642-8f09-01e015db2de2","Type":"ContainerStarted","Data":"325a48c1713601a48ed9f93c49f6dce4d8a334021185bb87e9979f83a2b875a5"} Dec 04 12:39:08 crc kubenswrapper[4760]: I1204 12:39:08.101888 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 12:39:08 crc kubenswrapper[4760]: I1204 12:39:08.101965 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 12:39:08 crc kubenswrapper[4760]: I1204 12:39:08.144262 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.1442154589999998 podStartE2EDuration="3.144215459s" podCreationTimestamp="2025-12-04 12:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:39:08.132275711 +0000 UTC m=+1551.173722278" watchObservedRunningTime="2025-12-04 12:39:08.144215459 +0000 UTC m=+1551.185662026" Dec 04 12:39:08 crc kubenswrapper[4760]: I1204 12:39:08.403517 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 04 12:39:10 crc kubenswrapper[4760]: I1204 12:39:10.124775 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:39:10 crc kubenswrapper[4760]: I1204 12:39:10.125144 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:39:10 crc kubenswrapper[4760]: I1204 12:39:10.945981 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 12:39:11 crc kubenswrapper[4760]: I1204 12:39:11.143690 4760 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Dec 04 12:39:11 crc kubenswrapper[4760]: I1204 12:39:11.183885 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 12:39:12 crc kubenswrapper[4760]: I1204 12:39:12.170346 4760 generic.go:334] "Generic (PLEG): container finished" podID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerID="e1cd34c714dd9abe34542df33c12b6621bf3d46bf1611d48fc31b83dfccbcb5e" exitCode=0 Dec 04 12:39:12 crc kubenswrapper[4760]: I1204 12:39:12.171791 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58","Type":"ContainerDied","Data":"e1cd34c714dd9abe34542df33c12b6621bf3d46bf1611d48fc31b83dfccbcb5e"} Dec 04 12:39:14 crc kubenswrapper[4760]: I1204 12:39:14.997862 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.170194 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-combined-ca-bundle\") pod \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.170587 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5q25\" (UniqueName: \"kubernetes.io/projected/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-kube-api-access-c5q25\") pod \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.170637 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-run-httpd\") pod \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " 
Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.170775 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-config-data\") pod \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.170850 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-sg-core-conf-yaml\") pod \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.170880 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-scripts\") pod \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.170902 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-log-httpd\") pod \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\" (UID: \"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58\") " Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.172002 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" (UID: "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.172031 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" (UID: "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.176107 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-scripts" (OuterVolumeSpecName: "scripts") pod "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" (UID: "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.176932 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-kube-api-access-c5q25" (OuterVolumeSpecName: "kube-api-access-c5q25") pod "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" (UID: "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58"). InnerVolumeSpecName "kube-api-access-c5q25". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.204358 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" (UID: "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.239554 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5f2bd7ad-1f91-4cba-8202-aad5e4f18c58","Type":"ContainerDied","Data":"7fee5b77e46bad2d2c68c17adcaa01b900b53a828e049ecc1dccf97b2e021b92"} Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.239638 4760 scope.go:117] "RemoveContainer" containerID="b6cbb9b6d772d206cb0fe0e21723b0005b6a75386f66cd367702deda6204f031" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.239657 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.244504 4760 generic.go:334] "Generic (PLEG): container finished" podID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerID="6a21d1cb04b2f014d317db98e10c64b9c59da5c0cf7192d9412a2d0ed4984f79" exitCode=137 Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.244596 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7fc6c944-sh7tv" event={"ID":"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5","Type":"ContainerDied","Data":"6a21d1cb04b2f014d317db98e10c64b9c59da5c0cf7192d9412a2d0ed4984f79"} Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.244646 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7fc6c944-sh7tv" event={"ID":"a6452e5d-5eb7-4d21-96ea-eefbc327f2f5","Type":"ContainerStarted","Data":"704ceafeef03a2c2122b5f179d6ce112b07e9152bcec3b453bab75db5f429ed3"} Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.268844 4760 scope.go:117] "RemoveContainer" containerID="c0cef0ba5d9b5eaa51b35f44405dfe410116ffd95fb011638e0921894b2010c1" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.276743 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5q25\" (UniqueName: 
\"kubernetes.io/projected/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-kube-api-access-c5q25\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.276781 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.276793 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.276802 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.276813 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.296902 4760 scope.go:117] "RemoveContainer" containerID="574f222c7ce40fd6146c074338e6aececaca90c4276f9fad6197b3b005c0c145" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.317398 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" (UID: "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.323487 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-config-data" (OuterVolumeSpecName: "config-data") pod "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" (UID: "5f2bd7ad-1f91-4cba-8202-aad5e4f18c58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.324562 4760 scope.go:117] "RemoveContainer" containerID="e1cd34c714dd9abe34542df33c12b6621bf3d46bf1611d48fc31b83dfccbcb5e" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.347463 4760 scope.go:117] "RemoveContainer" containerID="7754d6e6486a7c466fd3b1bfb32b05bda479f30a77fc71e23d9cfe03f99ca18c" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.380371 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.380521 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.477724 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.482474 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.554964 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.563019 4760 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.633891 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.650900 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.676386 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:15 crc kubenswrapper[4760]: E1204 12:39:15.677103 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="proxy-httpd" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.677130 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="proxy-httpd" Dec 04 12:39:15 crc kubenswrapper[4760]: E1204 12:39:15.677161 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="ceilometer-central-agent" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.677170 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="ceilometer-central-agent" Dec 04 12:39:15 crc kubenswrapper[4760]: E1204 12:39:15.677202 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="ceilometer-notification-agent" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.677229 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="ceilometer-notification-agent" Dec 04 12:39:15 crc kubenswrapper[4760]: E1204 12:39:15.677248 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="sg-core" Dec 04 12:39:15 
crc kubenswrapper[4760]: I1204 12:39:15.677257 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="sg-core" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.677540 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="proxy-httpd" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.677576 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="sg-core" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.677594 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="ceilometer-central-agent" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.677613 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" containerName="ceilometer-notification-agent" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.679716 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.683640 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.684857 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.686330 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.799999 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-log-httpd\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.800467 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.800621 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.800760 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-run-httpd\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " 
pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.801190 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-config-data\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.801465 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnl54\" (UniqueName: \"kubernetes.io/projected/800116ed-2c40-41e3-a141-c2a37c62de4f-kube-api-access-bnl54\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.801600 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-scripts\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.879433 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f2bd7ad-1f91-4cba-8202-aad5e4f18c58" path="/var/lib/kubelet/pods/5f2bd7ad-1f91-4cba-8202-aad5e4f18c58/volumes" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.904556 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnl54\" (UniqueName: \"kubernetes.io/projected/800116ed-2c40-41e3-a141-c2a37c62de4f-kube-api-access-bnl54\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.904730 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-scripts\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.904872 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-log-httpd\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.905010 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.905096 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.905202 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-run-httpd\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.905379 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-config-data\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.906435 
4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-log-httpd\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.906607 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-run-httpd\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.912469 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-config-data\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.912783 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.913541 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-scripts\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.914036 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 
12:39:15 crc kubenswrapper[4760]: I1204 12:39:15.933953 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnl54\" (UniqueName: \"kubernetes.io/projected/800116ed-2c40-41e3-a141-c2a37c62de4f-kube-api-access-bnl54\") pod \"ceilometer-0\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " pod="openstack/ceilometer-0" Dec 04 12:39:16 crc kubenswrapper[4760]: I1204 12:39:16.000538 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:16 crc kubenswrapper[4760]: I1204 12:39:16.273311 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jpsv5" event={"ID":"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56","Type":"ContainerStarted","Data":"a480274240e5ff5d085247653af7f226604c60b910d2f6af1562ab79ff541799"} Dec 04 12:39:16 crc kubenswrapper[4760]: I1204 12:39:16.273896 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:16 crc kubenswrapper[4760]: I1204 12:39:16.273937 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:16 crc kubenswrapper[4760]: I1204 12:39:16.297731 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-jpsv5" podStartSLOduration=3.004616415 podStartE2EDuration="15.297693256s" podCreationTimestamp="2025-12-04 12:39:01 +0000 UTC" firstStartedPulling="2025-12-04 12:39:02.531818694 +0000 UTC m=+1545.573265261" lastFinishedPulling="2025-12-04 12:39:14.824895535 +0000 UTC m=+1557.866342102" observedRunningTime="2025-12-04 12:39:16.297616373 +0000 UTC m=+1559.339062940" watchObservedRunningTime="2025-12-04 12:39:16.297693256 +0000 UTC m=+1559.339139823" Dec 04 12:39:16 crc kubenswrapper[4760]: I1204 12:39:16.555920 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:16 
crc kubenswrapper[4760]: I1204 12:39:16.877822 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:39:17 crc kubenswrapper[4760]: I1204 12:39:17.296361 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"800116ed-2c40-41e3-a141-c2a37c62de4f","Type":"ContainerStarted","Data":"be4aa54ef8360de4120e9da5136ad3808b7643886f2696263449490b8351824b"} Dec 04 12:39:18 crc kubenswrapper[4760]: I1204 12:39:18.315077 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"800116ed-2c40-41e3-a141-c2a37c62de4f","Type":"ContainerStarted","Data":"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b"} Dec 04 12:39:18 crc kubenswrapper[4760]: I1204 12:39:18.315770 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"800116ed-2c40-41e3-a141-c2a37c62de4f","Type":"ContainerStarted","Data":"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb"} Dec 04 12:39:19 crc kubenswrapper[4760]: I1204 12:39:19.090462 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:19 crc kubenswrapper[4760]: I1204 12:39:19.090943 4760 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 12:39:19 crc kubenswrapper[4760]: I1204 12:39:19.219182 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 12:39:19 crc kubenswrapper[4760]: I1204 12:39:19.368772 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:39:21 crc kubenswrapper[4760]: I1204 12:39:21.371913 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 04 12:39:22 crc kubenswrapper[4760]: I1204 12:39:22.367805 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"800116ed-2c40-41e3-a141-c2a37c62de4f","Type":"ContainerStarted","Data":"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb"} Dec 04 12:39:24 crc kubenswrapper[4760]: I1204 12:39:24.174911 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:39:24 crc kubenswrapper[4760]: I1204 12:39:24.175249 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:39:24 crc kubenswrapper[4760]: I1204 12:39:24.177573 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 04 12:39:29 crc kubenswrapper[4760]: I1204 12:39:29.481582 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"800116ed-2c40-41e3-a141-c2a37c62de4f","Type":"ContainerStarted","Data":"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a"} Dec 04 12:39:29 crc kubenswrapper[4760]: I1204 12:39:29.482467 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 12:39:29 crc kubenswrapper[4760]: I1204 12:39:29.507756 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.715230343 podStartE2EDuration="14.507710462s" podCreationTimestamp="2025-12-04 12:39:15 +0000 UTC" firstStartedPulling="2025-12-04 12:39:16.55487771 +0000 UTC m=+1559.596324277" lastFinishedPulling="2025-12-04 12:39:28.347357819 +0000 UTC m=+1571.388804396" observedRunningTime="2025-12-04 12:39:29.506578986 +0000 UTC m=+1572.548025553" watchObservedRunningTime="2025-12-04 
12:39:29.507710462 +0000 UTC m=+1572.549157029" Dec 04 12:39:33 crc kubenswrapper[4760]: I1204 12:39:33.380221 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:39:33 crc kubenswrapper[4760]: I1204 12:39:33.380796 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:39:34 crc kubenswrapper[4760]: I1204 12:39:34.175990 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b7fc6c944-sh7tv" podUID="a6452e5d-5eb7-4d21-96ea-eefbc327f2f5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 04 12:39:37 crc kubenswrapper[4760]: I1204 12:39:37.498101 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:37 crc kubenswrapper[4760]: I1204 12:39:37.499260 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="sg-core" containerID="cri-o://58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb" gracePeriod=30 Dec 04 12:39:37 crc kubenswrapper[4760]: I1204 12:39:37.499356 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="ceilometer-notification-agent" 
containerID="cri-o://996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b" gracePeriod=30 Dec 04 12:39:37 crc kubenswrapper[4760]: I1204 12:39:37.499308 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="proxy-httpd" containerID="cri-o://10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a" gracePeriod=30 Dec 04 12:39:37 crc kubenswrapper[4760]: I1204 12:39:37.499177 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="ceilometer-central-agent" containerID="cri-o://d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb" gracePeriod=30 Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.583506 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.593790 4760 generic.go:334] "Generic (PLEG): container finished" podID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerID="10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a" exitCode=0 Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.593837 4760 generic.go:334] "Generic (PLEG): container finished" podID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerID="58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb" exitCode=2 Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.593847 4760 generic.go:334] "Generic (PLEG): container finished" podID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerID="996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b" exitCode=0 Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.593855 4760 generic.go:334] "Generic (PLEG): container finished" podID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerID="d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb" exitCode=0 
Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.593884 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.593884 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"800116ed-2c40-41e3-a141-c2a37c62de4f","Type":"ContainerDied","Data":"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a"} Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.594736 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"800116ed-2c40-41e3-a141-c2a37c62de4f","Type":"ContainerDied","Data":"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb"} Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.594757 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"800116ed-2c40-41e3-a141-c2a37c62de4f","Type":"ContainerDied","Data":"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b"} Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.594771 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"800116ed-2c40-41e3-a141-c2a37c62de4f","Type":"ContainerDied","Data":"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb"} Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.594788 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"800116ed-2c40-41e3-a141-c2a37c62de4f","Type":"ContainerDied","Data":"be4aa54ef8360de4120e9da5136ad3808b7643886f2696263449490b8351824b"} Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.594815 4760 scope.go:117] "RemoveContainer" containerID="10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.613275 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-config-data\") pod \"800116ed-2c40-41e3-a141-c2a37c62de4f\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.617793 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-run-httpd\") pod \"800116ed-2c40-41e3-a141-c2a37c62de4f\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.617889 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-log-httpd\") pod \"800116ed-2c40-41e3-a141-c2a37c62de4f\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.617949 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-sg-core-conf-yaml\") pod \"800116ed-2c40-41e3-a141-c2a37c62de4f\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.618002 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-combined-ca-bundle\") pod \"800116ed-2c40-41e3-a141-c2a37c62de4f\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.618180 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnl54\" (UniqueName: \"kubernetes.io/projected/800116ed-2c40-41e3-a141-c2a37c62de4f-kube-api-access-bnl54\") pod \"800116ed-2c40-41e3-a141-c2a37c62de4f\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " Dec 04 12:39:38 crc kubenswrapper[4760]: 
I1204 12:39:38.618233 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-scripts\") pod \"800116ed-2c40-41e3-a141-c2a37c62de4f\" (UID: \"800116ed-2c40-41e3-a141-c2a37c62de4f\") " Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.619002 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "800116ed-2c40-41e3-a141-c2a37c62de4f" (UID: "800116ed-2c40-41e3-a141-c2a37c62de4f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.621827 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "800116ed-2c40-41e3-a141-c2a37c62de4f" (UID: "800116ed-2c40-41e3-a141-c2a37c62de4f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.622994 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.623017 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/800116ed-2c40-41e3-a141-c2a37c62de4f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.624840 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-scripts" (OuterVolumeSpecName: "scripts") pod "800116ed-2c40-41e3-a141-c2a37c62de4f" (UID: "800116ed-2c40-41e3-a141-c2a37c62de4f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.641602 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/800116ed-2c40-41e3-a141-c2a37c62de4f-kube-api-access-bnl54" (OuterVolumeSpecName: "kube-api-access-bnl54") pod "800116ed-2c40-41e3-a141-c2a37c62de4f" (UID: "800116ed-2c40-41e3-a141-c2a37c62de4f"). InnerVolumeSpecName "kube-api-access-bnl54". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.659492 4760 scope.go:117] "RemoveContainer" containerID="58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.712430 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "800116ed-2c40-41e3-a141-c2a37c62de4f" (UID: "800116ed-2c40-41e3-a141-c2a37c62de4f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.726300 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.726389 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnl54\" (UniqueName: \"kubernetes.io/projected/800116ed-2c40-41e3-a141-c2a37c62de4f-kube-api-access-bnl54\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.726409 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.750569 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "800116ed-2c40-41e3-a141-c2a37c62de4f" (UID: "800116ed-2c40-41e3-a141-c2a37c62de4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.769404 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-config-data" (OuterVolumeSpecName: "config-data") pod "800116ed-2c40-41e3-a141-c2a37c62de4f" (UID: "800116ed-2c40-41e3-a141-c2a37c62de4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.828665 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.829054 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/800116ed-2c40-41e3-a141-c2a37c62de4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.870659 4760 scope.go:117] "RemoveContainer" containerID="996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.894751 4760 scope.go:117] "RemoveContainer" containerID="d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.936116 4760 scope.go:117] "RemoveContainer" containerID="10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a" Dec 04 12:39:38 crc kubenswrapper[4760]: E1204 12:39:38.936880 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a\": container with ID starting with 10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a not found: ID does not exist" containerID="10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.936917 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a"} err="failed to get container status \"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a\": rpc error: code = NotFound desc = could not find container 
\"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a\": container with ID starting with 10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.936945 4760 scope.go:117] "RemoveContainer" containerID="58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb" Dec 04 12:39:38 crc kubenswrapper[4760]: E1204 12:39:38.937373 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb\": container with ID starting with 58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb not found: ID does not exist" containerID="58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.937445 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb"} err="failed to get container status \"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb\": rpc error: code = NotFound desc = could not find container \"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb\": container with ID starting with 58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.937491 4760 scope.go:117] "RemoveContainer" containerID="996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b" Dec 04 12:39:38 crc kubenswrapper[4760]: E1204 12:39:38.937856 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b\": container with ID starting with 996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b not found: ID does not exist" 
containerID="996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.937897 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b"} err="failed to get container status \"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b\": rpc error: code = NotFound desc = could not find container \"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b\": container with ID starting with 996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.937916 4760 scope.go:117] "RemoveContainer" containerID="d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb" Dec 04 12:39:38 crc kubenswrapper[4760]: E1204 12:39:38.938433 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb\": container with ID starting with d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb not found: ID does not exist" containerID="d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.938462 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb"} err="failed to get container status \"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb\": rpc error: code = NotFound desc = could not find container \"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb\": container with ID starting with d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.938478 4760 scope.go:117] 
"RemoveContainer" containerID="10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.938710 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a"} err="failed to get container status \"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a\": rpc error: code = NotFound desc = could not find container \"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a\": container with ID starting with 10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.938730 4760 scope.go:117] "RemoveContainer" containerID="58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.938955 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb"} err="failed to get container status \"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb\": rpc error: code = NotFound desc = could not find container \"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb\": container with ID starting with 58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.938988 4760 scope.go:117] "RemoveContainer" containerID="996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.939262 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b"} err="failed to get container status \"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b\": rpc error: code = 
NotFound desc = could not find container \"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b\": container with ID starting with 996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.939292 4760 scope.go:117] "RemoveContainer" containerID="d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.939537 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb"} err="failed to get container status \"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb\": rpc error: code = NotFound desc = could not find container \"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb\": container with ID starting with d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.939565 4760 scope.go:117] "RemoveContainer" containerID="10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.939801 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a"} err="failed to get container status \"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a\": rpc error: code = NotFound desc = could not find container \"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a\": container with ID starting with 10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.939831 4760 scope.go:117] "RemoveContainer" containerID="58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb" Dec 04 12:39:38 crc 
kubenswrapper[4760]: I1204 12:39:38.940062 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb"} err="failed to get container status \"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb\": rpc error: code = NotFound desc = could not find container \"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb\": container with ID starting with 58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.940087 4760 scope.go:117] "RemoveContainer" containerID="996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.940509 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b"} err="failed to get container status \"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b\": rpc error: code = NotFound desc = could not find container \"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b\": container with ID starting with 996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.940528 4760 scope.go:117] "RemoveContainer" containerID="d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.940732 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb"} err="failed to get container status \"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb\": rpc error: code = NotFound desc = could not find container \"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb\": container 
with ID starting with d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.940755 4760 scope.go:117] "RemoveContainer" containerID="10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.940952 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a"} err="failed to get container status \"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a\": rpc error: code = NotFound desc = could not find container \"10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a\": container with ID starting with 10801a9dbc57fe6c8ca30c2ed58ed906b32e5a247c42f43a22f154748d33478a not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.940973 4760 scope.go:117] "RemoveContainer" containerID="58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.941176 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb"} err="failed to get container status \"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb\": rpc error: code = NotFound desc = could not find container \"58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb\": container with ID starting with 58ed550dac0b2546460851f12573c9e1b640518f985c22b22244bc2d736573cb not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.941223 4760 scope.go:117] "RemoveContainer" containerID="996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.941457 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b"} err="failed to get container status \"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b\": rpc error: code = NotFound desc = could not find container \"996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b\": container with ID starting with 996855f05ddfd339099850e3520fd51eaa68268c4de3a1aad330b429142d983b not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.941481 4760 scope.go:117] "RemoveContainer" containerID="d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.941866 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb"} err="failed to get container status \"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb\": rpc error: code = NotFound desc = could not find container \"d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb\": container with ID starting with d88fafe763362e5e12f4539315b6a143211600f53441d0e62692dbcd8b2b63bb not found: ID does not exist" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.943340 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.962846 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.979607 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:38 crc kubenswrapper[4760]: E1204 12:39:38.980227 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="ceilometer-notification-agent" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.980254 4760 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="ceilometer-notification-agent" Dec 04 12:39:38 crc kubenswrapper[4760]: E1204 12:39:38.980273 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="proxy-httpd" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.980281 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="proxy-httpd" Dec 04 12:39:38 crc kubenswrapper[4760]: E1204 12:39:38.980306 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="sg-core" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.980314 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="sg-core" Dec 04 12:39:38 crc kubenswrapper[4760]: E1204 12:39:38.980326 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="ceilometer-central-agent" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.980334 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="ceilometer-central-agent" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.980620 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="sg-core" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.980648 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="ceilometer-notification-agent" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.980675 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="ceilometer-central-agent" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.980691 4760 
memory_manager.go:354] "RemoveStaleState removing state" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" containerName="proxy-httpd" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.983035 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.986453 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.993513 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:38 crc kubenswrapper[4760]: I1204 12:39:38.994535 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.035241 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-scripts\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.035300 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-run-httpd\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.035346 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-config-data\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.035685 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-log-httpd\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.035879 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9ff8\" (UniqueName: \"kubernetes.io/projected/02375d88-fb0d-4659-9400-57346a8ab133-kube-api-access-p9ff8\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.036116 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.036259 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.138702 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-config-data\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.138813 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-log-httpd\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.138865 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9ff8\" (UniqueName: \"kubernetes.io/projected/02375d88-fb0d-4659-9400-57346a8ab133-kube-api-access-p9ff8\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.138932 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.138969 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.139052 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-scripts\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.139075 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-run-httpd\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: 
I1204 12:39:39.139501 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-log-httpd\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.139554 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-run-httpd\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.143953 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.147039 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.147115 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-scripts\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.152408 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-config-data\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " 
pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.167999 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9ff8\" (UniqueName: \"kubernetes.io/projected/02375d88-fb0d-4659-9400-57346a8ab133-kube-api-access-p9ff8\") pod \"ceilometer-0\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.310441 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.879388 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="800116ed-2c40-41e3-a141-c2a37c62de4f" path="/var/lib/kubelet/pods/800116ed-2c40-41e3-a141-c2a37c62de4f/volumes" Dec 04 12:39:39 crc kubenswrapper[4760]: I1204 12:39:39.918631 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:40 crc kubenswrapper[4760]: I1204 12:39:40.136440 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:40 crc kubenswrapper[4760]: I1204 12:39:40.649793 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02375d88-fb0d-4659-9400-57346a8ab133","Type":"ContainerStarted","Data":"a7f01111d410bff989711dcdc7060874f83cdec5bb7bd1211432f6859e219465"} Dec 04 12:39:41 crc kubenswrapper[4760]: I1204 12:39:41.663286 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02375d88-fb0d-4659-9400-57346a8ab133","Type":"ContainerStarted","Data":"7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22"} Dec 04 12:39:42 crc kubenswrapper[4760]: I1204 12:39:42.678608 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"02375d88-fb0d-4659-9400-57346a8ab133","Type":"ContainerStarted","Data":"054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2"} Dec 04 12:39:42 crc kubenswrapper[4760]: I1204 12:39:42.678976 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02375d88-fb0d-4659-9400-57346a8ab133","Type":"ContainerStarted","Data":"a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2"} Dec 04 12:39:44 crc kubenswrapper[4760]: I1204 12:39:44.709684 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02375d88-fb0d-4659-9400-57346a8ab133","Type":"ContainerStarted","Data":"1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d"} Dec 04 12:39:44 crc kubenswrapper[4760]: I1204 12:39:44.710329 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 12:39:44 crc kubenswrapper[4760]: I1204 12:39:44.710031 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="proxy-httpd" containerID="cri-o://1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d" gracePeriod=30 Dec 04 12:39:44 crc kubenswrapper[4760]: I1204 12:39:44.709997 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="ceilometer-central-agent" containerID="cri-o://7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22" gracePeriod=30 Dec 04 12:39:44 crc kubenswrapper[4760]: I1204 12:39:44.710071 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="sg-core" containerID="cri-o://054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2" gracePeriod=30 Dec 04 12:39:44 crc kubenswrapper[4760]: I1204 
12:39:44.710076 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="ceilometer-notification-agent" containerID="cri-o://a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2" gracePeriod=30 Dec 04 12:39:44 crc kubenswrapper[4760]: I1204 12:39:44.751402 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.016057192 podStartE2EDuration="6.751365092s" podCreationTimestamp="2025-12-04 12:39:38 +0000 UTC" firstStartedPulling="2025-12-04 12:39:39.927584341 +0000 UTC m=+1582.969030908" lastFinishedPulling="2025-12-04 12:39:43.662892241 +0000 UTC m=+1586.704338808" observedRunningTime="2025-12-04 12:39:44.739235548 +0000 UTC m=+1587.780682115" watchObservedRunningTime="2025-12-04 12:39:44.751365092 +0000 UTC m=+1587.792811659" Dec 04 12:39:45 crc kubenswrapper[4760]: I1204 12:39:45.861960 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02375d88-fb0d-4659-9400-57346a8ab133","Type":"ContainerDied","Data":"1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d"} Dec 04 12:39:45 crc kubenswrapper[4760]: I1204 12:39:45.861912 4760 generic.go:334] "Generic (PLEG): container finished" podID="02375d88-fb0d-4659-9400-57346a8ab133" containerID="1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d" exitCode=0 Dec 04 12:39:45 crc kubenswrapper[4760]: I1204 12:39:45.862352 4760 generic.go:334] "Generic (PLEG): container finished" podID="02375d88-fb0d-4659-9400-57346a8ab133" containerID="054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2" exitCode=2 Dec 04 12:39:45 crc kubenswrapper[4760]: I1204 12:39:45.862363 4760 generic.go:334] "Generic (PLEG): container finished" podID="02375d88-fb0d-4659-9400-57346a8ab133" containerID="a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2" exitCode=0 Dec 04 
12:39:45 crc kubenswrapper[4760]: I1204 12:39:45.862382 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02375d88-fb0d-4659-9400-57346a8ab133","Type":"ContainerDied","Data":"054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2"} Dec 04 12:39:45 crc kubenswrapper[4760]: I1204 12:39:45.862416 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02375d88-fb0d-4659-9400-57346a8ab133","Type":"ContainerDied","Data":"a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2"} Dec 04 12:39:46 crc kubenswrapper[4760]: I1204 12:39:46.830863 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:39:46 crc kubenswrapper[4760]: I1204 12:39:46.876050 4760 generic.go:334] "Generic (PLEG): container finished" podID="1ceeb719-ebc2-4aff-9b92-0e256d9c3c56" containerID="a480274240e5ff5d085247653af7f226604c60b910d2f6af1562ab79ff541799" exitCode=0 Dec 04 12:39:46 crc kubenswrapper[4760]: I1204 12:39:46.876152 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jpsv5" event={"ID":"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56","Type":"ContainerDied","Data":"a480274240e5ff5d085247653af7f226604c60b910d2f6af1562ab79ff541799"} Dec 04 12:39:47 crc kubenswrapper[4760]: I1204 12:39:47.785593 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.054527 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-log-httpd\") pod \"02375d88-fb0d-4659-9400-57346a8ab133\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.054684 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-sg-core-conf-yaml\") pod \"02375d88-fb0d-4659-9400-57346a8ab133\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.054777 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-config-data\") pod \"02375d88-fb0d-4659-9400-57346a8ab133\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.054802 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-scripts\") pod \"02375d88-fb0d-4659-9400-57346a8ab133\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.054849 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-run-httpd\") pod \"02375d88-fb0d-4659-9400-57346a8ab133\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.054924 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9ff8\" (UniqueName: 
\"kubernetes.io/projected/02375d88-fb0d-4659-9400-57346a8ab133-kube-api-access-p9ff8\") pod \"02375d88-fb0d-4659-9400-57346a8ab133\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.054968 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-combined-ca-bundle\") pod \"02375d88-fb0d-4659-9400-57346a8ab133\" (UID: \"02375d88-fb0d-4659-9400-57346a8ab133\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.062642 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "02375d88-fb0d-4659-9400-57346a8ab133" (UID: "02375d88-fb0d-4659-9400-57346a8ab133"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.063969 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "02375d88-fb0d-4659-9400-57346a8ab133" (UID: "02375d88-fb0d-4659-9400-57346a8ab133"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.070625 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02375d88-fb0d-4659-9400-57346a8ab133-kube-api-access-p9ff8" (OuterVolumeSpecName: "kube-api-access-p9ff8") pod "02375d88-fb0d-4659-9400-57346a8ab133" (UID: "02375d88-fb0d-4659-9400-57346a8ab133"). InnerVolumeSpecName "kube-api-access-p9ff8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.098458 4760 generic.go:334] "Generic (PLEG): container finished" podID="02375d88-fb0d-4659-9400-57346a8ab133" containerID="7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22" exitCode=0 Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.098769 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.103776 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-scripts" (OuterVolumeSpecName: "scripts") pod "02375d88-fb0d-4659-9400-57346a8ab133" (UID: "02375d88-fb0d-4659-9400-57346a8ab133"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.110846 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "02375d88-fb0d-4659-9400-57346a8ab133" (UID: "02375d88-fb0d-4659-9400-57346a8ab133"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.157468 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.157508 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.157520 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.157531 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02375d88-fb0d-4659-9400-57346a8ab133-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.157541 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9ff8\" (UniqueName: \"kubernetes.io/projected/02375d88-fb0d-4659-9400-57346a8ab133-kube-api-access-p9ff8\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.182637 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02375d88-fb0d-4659-9400-57346a8ab133","Type":"ContainerDied","Data":"7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22"} Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.182954 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02375d88-fb0d-4659-9400-57346a8ab133","Type":"ContainerDied","Data":"a7f01111d410bff989711dcdc7060874f83cdec5bb7bd1211432f6859e219465"} Dec 04 12:39:48 crc 
kubenswrapper[4760]: I1204 12:39:48.182985 4760 scope.go:117] "RemoveContainer" containerID="1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.189737 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02375d88-fb0d-4659-9400-57346a8ab133" (UID: "02375d88-fb0d-4659-9400-57346a8ab133"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.213415 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-config-data" (OuterVolumeSpecName: "config-data") pod "02375d88-fb0d-4659-9400-57346a8ab133" (UID: "02375d88-fb0d-4659-9400-57346a8ab133"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.229181 4760 scope.go:117] "RemoveContainer" containerID="054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.249876 4760 scope.go:117] "RemoveContainer" containerID="a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.259942 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.260015 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02375d88-fb0d-4659-9400-57346a8ab133-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.275491 4760 
scope.go:117] "RemoveContainer" containerID="7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.303510 4760 scope.go:117] "RemoveContainer" containerID="1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d" Dec 04 12:39:48 crc kubenswrapper[4760]: E1204 12:39:48.303985 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d\": container with ID starting with 1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d not found: ID does not exist" containerID="1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.304033 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d"} err="failed to get container status \"1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d\": rpc error: code = NotFound desc = could not find container \"1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d\": container with ID starting with 1ea99dc0ab7185bc4fd8a6d13f8b1b0953dc69a25a04f71e245c096f883fdc9d not found: ID does not exist" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.304069 4760 scope.go:117] "RemoveContainer" containerID="054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2" Dec 04 12:39:48 crc kubenswrapper[4760]: E1204 12:39:48.304647 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2\": container with ID starting with 054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2 not found: ID does not exist" containerID="054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2" Dec 04 
12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.304684 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2"} err="failed to get container status \"054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2\": rpc error: code = NotFound desc = could not find container \"054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2\": container with ID starting with 054c6a598c2cd436823372155319bbc8882a29f1c5c89832c39e346f6f1c19a2 not found: ID does not exist" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.304699 4760 scope.go:117] "RemoveContainer" containerID="a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2" Dec 04 12:39:48 crc kubenswrapper[4760]: E1204 12:39:48.305113 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2\": container with ID starting with a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2 not found: ID does not exist" containerID="a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.305152 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2"} err="failed to get container status \"a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2\": rpc error: code = NotFound desc = could not find container \"a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2\": container with ID starting with a790b164508339673bbb3b1acb344eca801bc633f7f64c1822cd71f9e30ca0e2 not found: ID does not exist" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.305168 4760 scope.go:117] "RemoveContainer" 
containerID="7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22" Dec 04 12:39:48 crc kubenswrapper[4760]: E1204 12:39:48.305768 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22\": container with ID starting with 7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22 not found: ID does not exist" containerID="7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.305795 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22"} err="failed to get container status \"7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22\": rpc error: code = NotFound desc = could not find container \"7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22\": container with ID starting with 7996f8cf8f16e9f4520c34d3a1f6b5d11f43034076f39172ad17120e212d5d22 not found: ID does not exist" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.513775 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.561606 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.583373 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.628584 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:48 crc kubenswrapper[4760]: E1204 12:39:48.629270 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="proxy-httpd" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.629293 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="proxy-httpd" Dec 04 12:39:48 crc kubenswrapper[4760]: E1204 12:39:48.629315 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="ceilometer-central-agent" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.629321 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="ceilometer-central-agent" Dec 04 12:39:48 crc kubenswrapper[4760]: E1204 12:39:48.629340 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="sg-core" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.629347 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="sg-core" Dec 04 12:39:48 crc kubenswrapper[4760]: E1204 12:39:48.629358 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="ceilometer-notification-agent" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.629364 4760 
state_mem.go:107] "Deleted CPUSet assignment" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="ceilometer-notification-agent" Dec 04 12:39:48 crc kubenswrapper[4760]: E1204 12:39:48.629387 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ceeb719-ebc2-4aff-9b92-0e256d9c3c56" containerName="nova-cell0-conductor-db-sync" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.629393 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ceeb719-ebc2-4aff-9b92-0e256d9c3c56" containerName="nova-cell0-conductor-db-sync" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.629606 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="ceilometer-central-agent" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.629621 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="sg-core" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.629645 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ceeb719-ebc2-4aff-9b92-0e256d9c3c56" containerName="nova-cell0-conductor-db-sync" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.629656 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="ceilometer-notification-agent" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.629671 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="02375d88-fb0d-4659-9400-57346a8ab133" containerName="proxy-httpd" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.631857 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.635038 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.634606 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.650244 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.670031 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-config-data\") pod \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.670121 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-scripts\") pod \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.670425 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-combined-ca-bundle\") pod \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.670492 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpk62\" (UniqueName: \"kubernetes.io/projected/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-kube-api-access-mpk62\") pod \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\" (UID: \"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56\") " Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 
12:39:48.686896 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-scripts" (OuterVolumeSpecName: "scripts") pod "1ceeb719-ebc2-4aff-9b92-0e256d9c3c56" (UID: "1ceeb719-ebc2-4aff-9b92-0e256d9c3c56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.687508 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-kube-api-access-mpk62" (OuterVolumeSpecName: "kube-api-access-mpk62") pod "1ceeb719-ebc2-4aff-9b92-0e256d9c3c56" (UID: "1ceeb719-ebc2-4aff-9b92-0e256d9c3c56"). InnerVolumeSpecName "kube-api-access-mpk62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.717514 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ceeb719-ebc2-4aff-9b92-0e256d9c3c56" (UID: "1ceeb719-ebc2-4aff-9b92-0e256d9c3c56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.721383 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-config-data" (OuterVolumeSpecName: "config-data") pod "1ceeb719-ebc2-4aff-9b92-0e256d9c3c56" (UID: "1ceeb719-ebc2-4aff-9b92-0e256d9c3c56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.774727 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-run-httpd\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.774834 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-config-data\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.774907 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.774932 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.774952 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-log-httpd\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.774985 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxh5n\" (UniqueName: \"kubernetes.io/projected/80050c73-ddfc-4a3d-85ab-fe637021ed0b-kube-api-access-sxh5n\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.775036 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-scripts\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.775119 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.775130 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.775139 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.775151 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpk62\" (UniqueName: \"kubernetes.io/projected/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56-kube-api-access-mpk62\") on node \"crc\" DevicePath \"\"" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.877582 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.877649 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.877671 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-log-httpd\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.877711 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxh5n\" (UniqueName: \"kubernetes.io/projected/80050c73-ddfc-4a3d-85ab-fe637021ed0b-kube-api-access-sxh5n\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.877747 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-scripts\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.877838 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-run-httpd\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 
12:39:48.877874 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-config-data\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.878465 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-log-httpd\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.878775 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-run-httpd\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.882235 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.882502 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.882581 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-config-data\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " 
pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.883394 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-scripts\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.898382 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxh5n\" (UniqueName: \"kubernetes.io/projected/80050c73-ddfc-4a3d-85ab-fe637021ed0b-kube-api-access-sxh5n\") pod \"ceilometer-0\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") " pod="openstack/ceilometer-0" Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.957945 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:48 crc kubenswrapper[4760]: I1204 12:39:48.959150 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.120495 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-jpsv5" event={"ID":"1ceeb719-ebc2-4aff-9b92-0e256d9c3c56","Type":"ContainerDied","Data":"27b0a49e4f7b06d40141ca82bbc4beee0e5790157b936c6d1d9bf7138af63959"} Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.120846 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27b0a49e4f7b06d40141ca82bbc4beee0e5790157b936c6d1d9bf7138af63959" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.121149 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-jpsv5" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.193165 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.194804 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.201900 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.202356 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bpqcl" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.219979 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.297841 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vd68\" (UniqueName: \"kubernetes.io/projected/9f10a840-d1c4-4d92-bb37-5abe342cb4d1-kube-api-access-8vd68\") pod \"nova-cell0-conductor-0\" (UID: \"9f10a840-d1c4-4d92-bb37-5abe342cb4d1\") " pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.297904 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f10a840-d1c4-4d92-bb37-5abe342cb4d1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9f10a840-d1c4-4d92-bb37-5abe342cb4d1\") " pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.298344 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f10a840-d1c4-4d92-bb37-5abe342cb4d1-combined-ca-bundle\") pod 
\"nova-cell0-conductor-0\" (UID: \"9f10a840-d1c4-4d92-bb37-5abe342cb4d1\") " pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.333788 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5b7fc6c944-sh7tv" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.400931 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vd68\" (UniqueName: \"kubernetes.io/projected/9f10a840-d1c4-4d92-bb37-5abe342cb4d1-kube-api-access-8vd68\") pod \"nova-cell0-conductor-0\" (UID: \"9f10a840-d1c4-4d92-bb37-5abe342cb4d1\") " pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.401032 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f10a840-d1c4-4d92-bb37-5abe342cb4d1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9f10a840-d1c4-4d92-bb37-5abe342cb4d1\") " pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.401149 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f10a840-d1c4-4d92-bb37-5abe342cb4d1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9f10a840-d1c4-4d92-bb37-5abe342cb4d1\") " pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.410237 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f10a840-d1c4-4d92-bb37-5abe342cb4d1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9f10a840-d1c4-4d92-bb37-5abe342cb4d1\") " pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.411865 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9f10a840-d1c4-4d92-bb37-5abe342cb4d1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9f10a840-d1c4-4d92-bb37-5abe342cb4d1\") " pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.414887 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66f8fb5648-87dff"] Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.415435 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon-log" containerID="cri-o://80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec" gracePeriod=30 Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.416468 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" containerID="cri-o://7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba" gracePeriod=30 Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.440431 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vd68\" (UniqueName: \"kubernetes.io/projected/9f10a840-d1c4-4d92-bb37-5abe342cb4d1-kube-api-access-8vd68\") pod \"nova-cell0-conductor-0\" (UID: \"9f10a840-d1c4-4d92-bb37-5abe342cb4d1\") " pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.512286 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.523727 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:49 crc kubenswrapper[4760]: I1204 12:39:49.889160 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02375d88-fb0d-4659-9400-57346a8ab133" path="/var/lib/kubelet/pods/02375d88-fb0d-4659-9400-57346a8ab133/volumes" Dec 04 12:39:50 crc kubenswrapper[4760]: I1204 12:39:50.134772 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80050c73-ddfc-4a3d-85ab-fe637021ed0b","Type":"ContainerStarted","Data":"d3b569aa39809a973b024c60732b410aaaa5b2da538871e1aedc416f1c3c5bef"} Dec 04 12:39:50 crc kubenswrapper[4760]: W1204 12:39:50.252715 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f10a840_d1c4_4d92_bb37_5abe342cb4d1.slice/crio-fa9ca320911b078717cd99c358a0ff5e9dae72727149add0a33c3058c9116797 WatchSource:0}: Error finding container fa9ca320911b078717cd99c358a0ff5e9dae72727149add0a33c3058c9116797: Status 404 returned error can't find the container with id fa9ca320911b078717cd99c358a0ff5e9dae72727149add0a33c3058c9116797 Dec 04 12:39:50 crc kubenswrapper[4760]: I1204 12:39:50.257548 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 12:39:51 crc kubenswrapper[4760]: I1204 12:39:51.147820 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9f10a840-d1c4-4d92-bb37-5abe342cb4d1","Type":"ContainerStarted","Data":"fa9ca320911b078717cd99c358a0ff5e9dae72727149add0a33c3058c9116797"} Dec 04 12:39:53 crc kubenswrapper[4760]: I1204 12:39:53.170667 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9f10a840-d1c4-4d92-bb37-5abe342cb4d1","Type":"ContainerStarted","Data":"699b17ec60e45362d711fb9547de34e47945e315028ed64d04e39571315a7dcb"} Dec 04 12:39:53 crc kubenswrapper[4760]: I1204 12:39:53.172869 4760 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 12:39:53 crc kubenswrapper[4760]: I1204 12:39:53.173684 4760 generic.go:334] "Generic (PLEG): container finished" podID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerID="7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba" exitCode=0 Dec 04 12:39:53 crc kubenswrapper[4760]: I1204 12:39:53.173738 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f8fb5648-87dff" event={"ID":"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc","Type":"ContainerDied","Data":"7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba"} Dec 04 12:39:53 crc kubenswrapper[4760]: I1204 12:39:53.173784 4760 scope.go:117] "RemoveContainer" containerID="850fe6326e562e31921afe14eb22f002e3b2f4fe609aaeedf11c8c3082f601e7" Dec 04 12:39:53 crc kubenswrapper[4760]: I1204 12:39:53.203306 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=4.203277702 podStartE2EDuration="4.203277702s" podCreationTimestamp="2025-12-04 12:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:39:53.194850785 +0000 UTC m=+1596.236297372" watchObservedRunningTime="2025-12-04 12:39:53.203277702 +0000 UTC m=+1596.244724269" Dec 04 12:39:53 crc kubenswrapper[4760]: I1204 12:39:53.915479 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 04 12:39:54 crc kubenswrapper[4760]: I1204 12:39:54.210718 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"80050c73-ddfc-4a3d-85ab-fe637021ed0b","Type":"ContainerStarted","Data":"6b0c4da86413e530700e7924c2245eedb53c9d55c8f0c2cb34933ca53da1fef0"} Dec 04 12:39:55 crc kubenswrapper[4760]: I1204 12:39:55.224528 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80050c73-ddfc-4a3d-85ab-fe637021ed0b","Type":"ContainerStarted","Data":"2888f40e7b4eb9990fb777346ba59278f0e6a9e7f2244b01c8a2a4bec78610a0"} Dec 04 12:39:56 crc kubenswrapper[4760]: I1204 12:39:56.244017 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80050c73-ddfc-4a3d-85ab-fe637021ed0b","Type":"ContainerStarted","Data":"7ec56840e8a439785f9cdf01589c21dc6f5baf0064892e855bb3b2919ceb9bfd"} Dec 04 12:39:57 crc kubenswrapper[4760]: I1204 12:39:57.259468 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80050c73-ddfc-4a3d-85ab-fe637021ed0b","Type":"ContainerStarted","Data":"4115e8582cc9b056536a3e6305106121615cb4573a78c7e88e2553041522e265"} Dec 04 12:39:57 crc kubenswrapper[4760]: I1204 12:39:57.259899 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="ceilometer-central-agent" containerID="cri-o://6b0c4da86413e530700e7924c2245eedb53c9d55c8f0c2cb34933ca53da1fef0" gracePeriod=30 Dec 04 12:39:57 crc kubenswrapper[4760]: I1204 12:39:57.260444 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="proxy-httpd" containerID="cri-o://4115e8582cc9b056536a3e6305106121615cb4573a78c7e88e2553041522e265" gracePeriod=30 Dec 04 12:39:57 crc kubenswrapper[4760]: I1204 12:39:57.260593 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" 
containerName="ceilometer-notification-agent" containerID="cri-o://2888f40e7b4eb9990fb777346ba59278f0e6a9e7f2244b01c8a2a4bec78610a0" gracePeriod=30 Dec 04 12:39:57 crc kubenswrapper[4760]: I1204 12:39:57.260705 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="sg-core" containerID="cri-o://7ec56840e8a439785f9cdf01589c21dc6f5baf0064892e855bb3b2919ceb9bfd" gracePeriod=30 Dec 04 12:39:57 crc kubenswrapper[4760]: I1204 12:39:57.259927 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 12:39:57 crc kubenswrapper[4760]: I1204 12:39:57.303431 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.686071228 podStartE2EDuration="9.303403463s" podCreationTimestamp="2025-12-04 12:39:48 +0000 UTC" firstStartedPulling="2025-12-04 12:39:49.799168035 +0000 UTC m=+1592.840614602" lastFinishedPulling="2025-12-04 12:39:56.41650026 +0000 UTC m=+1599.457946837" observedRunningTime="2025-12-04 12:39:57.290181154 +0000 UTC m=+1600.331627741" watchObservedRunningTime="2025-12-04 12:39:57.303403463 +0000 UTC m=+1600.344850030" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.274017 4760 generic.go:334] "Generic (PLEG): container finished" podID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerID="4115e8582cc9b056536a3e6305106121615cb4573a78c7e88e2553041522e265" exitCode=0 Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.274114 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80050c73-ddfc-4a3d-85ab-fe637021ed0b","Type":"ContainerDied","Data":"4115e8582cc9b056536a3e6305106121615cb4573a78c7e88e2553041522e265"} Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.275642 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"80050c73-ddfc-4a3d-85ab-fe637021ed0b","Type":"ContainerDied","Data":"7ec56840e8a439785f9cdf01589c21dc6f5baf0064892e855bb3b2919ceb9bfd"} Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.275524 4760 generic.go:334] "Generic (PLEG): container finished" podID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerID="7ec56840e8a439785f9cdf01589c21dc6f5baf0064892e855bb3b2919ceb9bfd" exitCode=2 Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.276090 4760 generic.go:334] "Generic (PLEG): container finished" podID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerID="2888f40e7b4eb9990fb777346ba59278f0e6a9e7f2244b01c8a2a4bec78610a0" exitCode=0 Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.276143 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80050c73-ddfc-4a3d-85ab-fe637021ed0b","Type":"ContainerDied","Data":"2888f40e7b4eb9990fb777346ba59278f0e6a9e7f2244b01c8a2a4bec78610a0"} Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.549967 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9fm5q"] Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.566178 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.567885 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9fm5q"] Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.689701 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-catalog-content\") pod \"redhat-operators-9fm5q\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.690109 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6v4l\" (UniqueName: \"kubernetes.io/projected/59247d0a-4277-4c1f-b351-057d2b553de6-kube-api-access-r6v4l\") pod \"redhat-operators-9fm5q\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.690540 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-utilities\") pod \"redhat-operators-9fm5q\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.793476 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-utilities\") pod \"redhat-operators-9fm5q\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.793665 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-catalog-content\") pod \"redhat-operators-9fm5q\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.793738 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6v4l\" (UniqueName: \"kubernetes.io/projected/59247d0a-4277-4c1f-b351-057d2b553de6-kube-api-access-r6v4l\") pod \"redhat-operators-9fm5q\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.795035 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-utilities\") pod \"redhat-operators-9fm5q\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.795385 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-catalog-content\") pod \"redhat-operators-9fm5q\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.832406 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6v4l\" (UniqueName: \"kubernetes.io/projected/59247d0a-4277-4c1f-b351-057d2b553de6-kube-api-access-r6v4l\") pod \"redhat-operators-9fm5q\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:58 crc kubenswrapper[4760]: I1204 12:39:58.902273 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:39:59 crc kubenswrapper[4760]: I1204 12:39:59.441866 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9fm5q"] Dec 04 12:39:59 crc kubenswrapper[4760]: I1204 12:39:59.568588 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.262565 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mtwzp"] Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.264230 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.266819 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.267107 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.289190 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mtwzp"] Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.300877 4760 generic.go:334] "Generic (PLEG): container finished" podID="59247d0a-4277-4c1f-b351-057d2b553de6" containerID="b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca" exitCode=0 Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.300943 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9fm5q" event={"ID":"59247d0a-4277-4c1f-b351-057d2b553de6","Type":"ContainerDied","Data":"b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca"} Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.300983 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9fm5q" event={"ID":"59247d0a-4277-4c1f-b351-057d2b553de6","Type":"ContainerStarted","Data":"6c397af490ef2141dd5b4ca95c1e69d60c67283613e60d1284e2c1cccdce2717"} Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.335072 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-config-data\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.335593 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvj64\" (UniqueName: \"kubernetes.io/projected/1b979c43-e46a-4104-a17a-fb065693bbbc-kube-api-access-hvj64\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.335735 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.336078 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-scripts\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.498389 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hvj64\" (UniqueName: \"kubernetes.io/projected/1b979c43-e46a-4104-a17a-fb065693bbbc-kube-api-access-hvj64\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.498559 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.498670 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-scripts\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.498831 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-config-data\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.508459 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.509951 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-scripts\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.519814 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-config-data\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.545034 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvj64\" (UniqueName: \"kubernetes.io/projected/1b979c43-e46a-4104-a17a-fb065693bbbc-kube-api-access-hvj64\") pod \"nova-cell0-cell-mapping-mtwzp\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.586890 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.772010 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.775239 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.791919 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.812583 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-config-data\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.812682 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.812769 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89nlg\" (UniqueName: \"kubernetes.io/projected/c46ccd08-f1b0-4c5a-afde-744a5286c129-kube-api-access-89nlg\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.812837 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46ccd08-f1b0-4c5a-afde-744a5286c129-logs\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.836069 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.915716 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.915879 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89nlg\" (UniqueName: \"kubernetes.io/projected/c46ccd08-f1b0-4c5a-afde-744a5286c129-kube-api-access-89nlg\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.915991 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46ccd08-f1b0-4c5a-afde-744a5286c129-logs\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.916166 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-config-data\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.917655 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.923865 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.924766 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46ccd08-f1b0-4c5a-afde-744a5286c129-logs\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.926455 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-config-data\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.927675 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.947338 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.964156 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 12:40:00 crc kubenswrapper[4760]: I1204 12:40:00.971059 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89nlg\" (UniqueName: \"kubernetes.io/projected/c46ccd08-f1b0-4c5a-afde-744a5286c129-kube-api-access-89nlg\") pod \"nova-api-0\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " pod="openstack/nova-api-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.014133 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.016203 4760 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.021762 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.021847 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.021871 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.021945 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpnf4\" (UniqueName: \"kubernetes.io/projected/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-kube-api-access-qpnf4\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.022016 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd5qd\" (UniqueName: \"kubernetes.io/projected/40fb9f3e-4d85-4625-b137-5b349cff347a-kube-api-access-dd5qd\") pod 
\"nova-scheduler-0\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.022045 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-config-data\") pod \"nova-scheduler-0\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.033953 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.057301 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.132744 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpnf4\" (UniqueName: \"kubernetes.io/projected/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-kube-api-access-qpnf4\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.133033 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd5qd\" (UniqueName: \"kubernetes.io/projected/40fb9f3e-4d85-4625-b137-5b349cff347a-kube-api-access-dd5qd\") pod \"nova-scheduler-0\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.133108 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-config-data\") pod \"nova-scheduler-0\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.133178 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.134309 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.136632 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.136679 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.142136 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.142585 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:01 crc 
kubenswrapper[4760]: I1204 12:40:01.149178 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.150513 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-config-data\") pod \"nova-scheduler-0\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.170587 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd5qd\" (UniqueName: \"kubernetes.io/projected/40fb9f3e-4d85-4625-b137-5b349cff347a-kube-api-access-dd5qd\") pod \"nova-scheduler-0\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.170610 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpnf4\" (UniqueName: \"kubernetes.io/projected/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-kube-api-access-qpnf4\") pod \"nova-cell1-novncproxy-0\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.342197 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.398988 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.405159 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.436265 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.450034 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.477991 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.529300 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-knjbd"] Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.531886 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.545937 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.546030 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj6tk\" (UniqueName: \"kubernetes.io/projected/a3564a3c-2250-4ac6-b382-15adcf042f3f-kube-api-access-hj6tk\") pod \"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.546133 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-config-data\") pod \"nova-metadata-0\" (UID: 
\"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.560654 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3564a3c-2250-4ac6-b382-15adcf042f3f-logs\") pod \"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.596318 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-knjbd"] Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.663743 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-svc\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.665762 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.665820 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj6tk\" (UniqueName: \"kubernetes.io/projected/a3564a3c-2250-4ac6-b382-15adcf042f3f-kube-api-access-hj6tk\") pod \"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.665926 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-config-data\") pod 
\"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.665956 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.666031 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.666115 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v622n\" (UniqueName: \"kubernetes.io/projected/6d016d51-bdd1-490e-8751-a0c0bcf70f92-kube-api-access-v622n\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.666144 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-config\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.666192 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.666331 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3564a3c-2250-4ac6-b382-15adcf042f3f-logs\") pod \"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.685107 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mtwzp"] Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.693847 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3564a3c-2250-4ac6-b382-15adcf042f3f-logs\") pod \"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.699023 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.709527 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj6tk\" (UniqueName: \"kubernetes.io/projected/a3564a3c-2250-4ac6-b382-15adcf042f3f-kube-api-access-hj6tk\") pod \"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.726153 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-config-data\") pod \"nova-metadata-0\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.787360 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.792424 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.819947 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.821647 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.821863 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v622n\" (UniqueName: \"kubernetes.io/projected/6d016d51-bdd1-490e-8751-a0c0bcf70f92-kube-api-access-v622n\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.821918 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-config\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.821992 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.822444 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-svc\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 
04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.822618 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd"
Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.823710 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-config\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd"
Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.824350 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-swift-storage-0\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd"
Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.826168 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-svc\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd"
Dec 04 12:40:01 crc kubenswrapper[4760]: I1204 12:40:01.930038 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v622n\" (UniqueName: \"kubernetes.io/projected/6d016d51-bdd1-490e-8751-a0c0bcf70f92-kube-api-access-v622n\") pod \"dnsmasq-dns-6b6c754dc9-knjbd\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:01.972470 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.296226 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.376996 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.478981 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mtwzp" event={"ID":"1b979c43-e46a-4104-a17a-fb065693bbbc","Type":"ContainerStarted","Data":"7a2114d2834ec6fe3bd39fb3e1f184329858b0f7c9467c136be3c39f256382e4"}
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.482855 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9fm5q" event={"ID":"59247d0a-4277-4c1f-b351-057d2b553de6","Type":"ContainerStarted","Data":"cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72"}
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.485165 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46ccd08-f1b0-4c5a-afde-744a5286c129","Type":"ContainerStarted","Data":"c140f453a7b7c7dcf49fb1ea8b4c24cd4b07181b2a13c865c073fb7f542b844d"}
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.495124 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dqc6v"]
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.498488 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.503874 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.504066 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.539319 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dqc6v"]
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.564425 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.661221 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4jzs\" (UniqueName: \"kubernetes.io/projected/20b1fcf1-5d52-4713-bfa5-2856128d6df5-kube-api-access-j4jzs\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.661600 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.661808 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-config-data\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.661840 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-scripts\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.764925 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-config-data\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.764989 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-scripts\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.765029 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4jzs\" (UniqueName: \"kubernetes.io/projected/20b1fcf1-5d52-4713-bfa5-2856128d6df5-kube-api-access-j4jzs\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.765054 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.778885 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-scripts\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.784322 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.787056 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-config-data\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.802479 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-knjbd"]
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.942314 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 04 12:40:02 crc kubenswrapper[4760]: I1204 12:40:02.956873 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4jzs\" (UniqueName: \"kubernetes.io/projected/20b1fcf1-5d52-4713-bfa5-2856128d6df5-kube-api-access-j4jzs\") pod \"nova-cell1-conductor-db-sync-dqc6v\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.253109 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dqc6v"
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.390090 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.390509 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.578471 4760 generic.go:334] "Generic (PLEG): container finished" podID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerID="6b0c4da86413e530700e7924c2245eedb53c9d55c8f0c2cb34933ca53da1fef0" exitCode=0
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.578576 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80050c73-ddfc-4a3d-85ab-fe637021ed0b","Type":"ContainerDied","Data":"6b0c4da86413e530700e7924c2245eedb53c9d55c8f0c2cb34933ca53da1fef0"}
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.589607 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" event={"ID":"6d016d51-bdd1-490e-8751-a0c0bcf70f92","Type":"ContainerStarted","Data":"f6f70fbc25af9834aa36c4996a71929392fc662a56cbcea44894568e3f7d104e"}
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.591647 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3564a3c-2250-4ac6-b382-15adcf042f3f","Type":"ContainerStarted","Data":"8cc14b1cc26fd973ca5b90befbebf6b990f490c0c8eab566ebcfc20683f7c5ab"}
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.596973 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d","Type":"ContainerStarted","Data":"d5873d6f2587e1e684bfec5a462a71a6551cd8b48cbf6a9073521a1238d98ca6"}
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.598886 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40fb9f3e-4d85-4625-b137-5b349cff347a","Type":"ContainerStarted","Data":"bf9d6c3b7a3b7df38403e3ed93b50ae709495c5fc05aad5b73ceaaabe6210fa4"}
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.622832 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mtwzp" event={"ID":"1b979c43-e46a-4104-a17a-fb065693bbbc","Type":"ContainerStarted","Data":"1aadefe6022699d485dbabbf02459c00a1eb7dc7dbf3fc530898205522cdbaea"}
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.831724 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mtwzp" podStartSLOduration=3.831694342 podStartE2EDuration="3.831694342s" podCreationTimestamp="2025-12-04 12:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:03.661267551 +0000 UTC m=+1606.702714138" watchObservedRunningTime="2025-12-04 12:40:03.831694342 +0000 UTC m=+1606.873140909"
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.877794 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dqc6v"]
Dec 04 12:40:03 crc kubenswrapper[4760]: I1204 12:40:03.914378 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused"
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.105321 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.208378 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-scripts\") pod \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") "
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.208761 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxh5n\" (UniqueName: \"kubernetes.io/projected/80050c73-ddfc-4a3d-85ab-fe637021ed0b-kube-api-access-sxh5n\") pod \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") "
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.208894 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-config-data\") pod \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") "
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.208952 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-combined-ca-bundle\") pod \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") "
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.209065 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-log-httpd\") pod \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") "
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.209119 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-run-httpd\") pod \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") "
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.209202 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-sg-core-conf-yaml\") pod \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\" (UID: \"80050c73-ddfc-4a3d-85ab-fe637021ed0b\") "
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.209768 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "80050c73-ddfc-4a3d-85ab-fe637021ed0b" (UID: "80050c73-ddfc-4a3d-85ab-fe637021ed0b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.210574 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.211047 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "80050c73-ddfc-4a3d-85ab-fe637021ed0b" (UID: "80050c73-ddfc-4a3d-85ab-fe637021ed0b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.218431 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-scripts" (OuterVolumeSpecName: "scripts") pod "80050c73-ddfc-4a3d-85ab-fe637021ed0b" (UID: "80050c73-ddfc-4a3d-85ab-fe637021ed0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.219647 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80050c73-ddfc-4a3d-85ab-fe637021ed0b-kube-api-access-sxh5n" (OuterVolumeSpecName: "kube-api-access-sxh5n") pod "80050c73-ddfc-4a3d-85ab-fe637021ed0b" (UID: "80050c73-ddfc-4a3d-85ab-fe637021ed0b"). InnerVolumeSpecName "kube-api-access-sxh5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.253446 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "80050c73-ddfc-4a3d-85ab-fe637021ed0b" (UID: "80050c73-ddfc-4a3d-85ab-fe637021ed0b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.312097 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80050c73-ddfc-4a3d-85ab-fe637021ed0b-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.312646 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.312729 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxh5n\" (UniqueName: \"kubernetes.io/projected/80050c73-ddfc-4a3d-85ab-fe637021ed0b-kube-api-access-sxh5n\") on node \"crc\" DevicePath \"\""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.312834 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.337337 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80050c73-ddfc-4a3d-85ab-fe637021ed0b" (UID: "80050c73-ddfc-4a3d-85ab-fe637021ed0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.348530 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-config-data" (OuterVolumeSpecName: "config-data") pod "80050c73-ddfc-4a3d-85ab-fe637021ed0b" (UID: "80050c73-ddfc-4a3d-85ab-fe637021ed0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.415250 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.415295 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80050c73-ddfc-4a3d-85ab-fe637021ed0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.814761 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80050c73-ddfc-4a3d-85ab-fe637021ed0b","Type":"ContainerDied","Data":"d3b569aa39809a973b024c60732b410aaaa5b2da538871e1aedc416f1c3c5bef"}
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.814819 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.814845 4760 scope.go:117] "RemoveContainer" containerID="4115e8582cc9b056536a3e6305106121615cb4573a78c7e88e2553041522e265"
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.826001 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" event={"ID":"6d016d51-bdd1-490e-8751-a0c0bcf70f92","Type":"ContainerDied","Data":"2c63fa4a332f036e276c1c4d578d261f67a08e11df39ad3fe7d6b5202ddaffab"}
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.827144 4760 generic.go:334] "Generic (PLEG): container finished" podID="6d016d51-bdd1-490e-8751-a0c0bcf70f92" containerID="2c63fa4a332f036e276c1c4d578d261f67a08e11df39ad3fe7d6b5202ddaffab" exitCode=0
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.832916 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dqc6v" event={"ID":"20b1fcf1-5d52-4713-bfa5-2856128d6df5","Type":"ContainerStarted","Data":"11b57215c4038a5e5154e0aab357dfd0a47973cd6c77102069774c1bb55d6641"}
Dec 04 12:40:04 crc kubenswrapper[4760]: I1204 12:40:04.844880 4760 scope.go:117] "RemoveContainer" containerID="7ec56840e8a439785f9cdf01589c21dc6f5baf0064892e855bb3b2919ceb9bfd"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.424569 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.454723 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.471654 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 04 12:40:05 crc kubenswrapper[4760]: E1204 12:40:05.472382 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="ceilometer-notification-agent"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.472459 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="ceilometer-notification-agent"
Dec 04 12:40:05 crc kubenswrapper[4760]: E1204 12:40:05.472483 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="ceilometer-central-agent"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.472543 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="ceilometer-central-agent"
Dec 04 12:40:05 crc kubenswrapper[4760]: E1204 12:40:05.472557 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="proxy-httpd"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.472563 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="proxy-httpd"
Dec 04 12:40:05 crc kubenswrapper[4760]: E1204 12:40:05.472578 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="sg-core"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.472586 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="sg-core"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.472901 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="ceilometer-notification-agent"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.472931 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="proxy-httpd"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.472948 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="ceilometer-central-agent"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.472963 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" containerName="sg-core"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.475702 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.481676 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.488525 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.488600 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.495505 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.496131 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-run-httpd\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.496354 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-config-data\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.497147 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-log-httpd\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.497252 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-scripts\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.497306 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khwm8\" (UniqueName: \"kubernetes.io/projected/99d6f493-b6f0-4340-9549-3c2c63e3c823-kube-api-access-khwm8\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.497346 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.563117 4760 scope.go:117] "RemoveContainer" containerID="2888f40e7b4eb9990fb777346ba59278f0e6a9e7f2244b01c8a2a4bec78610a0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.606833 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-log-httpd\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.606959 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-scripts\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.607039 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khwm8\" (UniqueName: \"kubernetes.io/projected/99d6f493-b6f0-4340-9549-3c2c63e3c823-kube-api-access-khwm8\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.607105 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.607324 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.607524 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-run-httpd\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.607593 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-config-data\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.609106 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-log-httpd\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.609302 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-run-httpd\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.614439 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-scripts\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.618928 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-config-data\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.618979 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.619360 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.620271 4760 scope.go:117] "RemoveContainer" containerID="6b0c4da86413e530700e7924c2245eedb53c9d55c8f0c2cb34933ca53da1fef0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.632136 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khwm8\" (UniqueName: \"kubernetes.io/projected/99d6f493-b6f0-4340-9549-3c2c63e3c823-kube-api-access-khwm8\") pod \"ceilometer-0\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: E1204 12:40:05.767817 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59247d0a_4277_4c1f_b351_057d2b553de6.slice/crio-conmon-cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80050c73_ddfc_4a3d_85ab_fe637021ed0b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80050c73_ddfc_4a3d_85ab_fe637021ed0b.slice/crio-d3b569aa39809a973b024c60732b410aaaa5b2da538871e1aedc416f1c3c5bef\": RecentStats: unable to find data in memory cache]"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.803233 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.839636 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.859915 4760 generic.go:334] "Generic (PLEG): container finished" podID="59247d0a-4277-4c1f-b351-057d2b553de6" containerID="cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72" exitCode=0
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.860037 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9fm5q" event={"ID":"59247d0a-4277-4c1f-b351-057d2b553de6","Type":"ContainerDied","Data":"cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72"}
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.877320 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 12:40:05 crc kubenswrapper[4760]: I1204 12:40:05.892165 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80050c73-ddfc-4a3d-85ab-fe637021ed0b" path="/var/lib/kubelet/pods/80050c73-ddfc-4a3d-85ab-fe637021ed0b/volumes"
Dec 04 12:40:06 crc kubenswrapper[4760]: I1204 12:40:06.510187 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 12:40:06 crc kubenswrapper[4760]: I1204 12:40:06.891543 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99d6f493-b6f0-4340-9549-3c2c63e3c823","Type":"ContainerStarted","Data":"407461c8829dcb5f0bb81066ae0fb908cc21ecf603b1c55f107708e6446372a2"}
Dec 04 12:40:06 crc kubenswrapper[4760]: I1204 12:40:06.896772 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dqc6v" event={"ID":"20b1fcf1-5d52-4713-bfa5-2856128d6df5","Type":"ContainerStarted","Data":"c897eb25114cb298991bc8c864cf2dbd15edd40eea2aa954f566c207bc2d2495"}
Dec 04 12:40:06 crc kubenswrapper[4760]: I1204 12:40:06.905748 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" event={"ID":"6d016d51-bdd1-490e-8751-a0c0bcf70f92","Type":"ContainerStarted","Data":"13775dd6e0320c6390ee1456def0a1bb1735174b5f463576bb1828d94176943f"}
Dec 04 12:40:06 crc kubenswrapper[4760]: I1204 12:40:06.906630 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd"
Dec 04 12:40:06 crc kubenswrapper[4760]: I1204 12:40:06.934085 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dqc6v" podStartSLOduration=4.934053041 podStartE2EDuration="4.934053041s" podCreationTimestamp="2025-12-04 12:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:06.918988922 +0000 UTC m=+1609.960435489" watchObservedRunningTime="2025-12-04 12:40:06.934053041 +0000 UTC m=+1609.975499618"
Dec 04 12:40:06 crc kubenswrapper[4760]: I1204 12:40:06.952458 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" podStartSLOduration=5.952423233 podStartE2EDuration="5.952423233s" podCreationTimestamp="2025-12-04 12:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:06.946551777 +0000 UTC m=+1609.987998344" watchObservedRunningTime="2025-12-04 12:40:06.952423233 +0000 UTC m=+1609.993869800"
Dec 04 12:40:09 crc kubenswrapper[4760]: I1204 12:40:09.991109 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40fb9f3e-4d85-4625-b137-5b349cff347a","Type":"ContainerStarted","Data":"234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217"}
Dec 04 12:40:09 crc kubenswrapper[4760]: I1204 12:40:09.993844 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46ccd08-f1b0-4c5a-afde-744a5286c129","Type":"ContainerStarted","Data":"0d8d27dc22250fb37e78784e039221b4a75ca250f07552ba61ff5d08b405b602"}
Dec 04 12:40:10 crc kubenswrapper[4760]: I1204 12:40:10.012942 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9fm5q" event={"ID":"59247d0a-4277-4c1f-b351-057d2b553de6","Type":"ContainerStarted","Data":"4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9"}
Dec 04 12:40:10 crc kubenswrapper[4760]: I1204 12:40:10.037558 4760
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3564a3c-2250-4ac6-b382-15adcf042f3f","Type":"ContainerStarted","Data":"a792157cf013bc5dc2808482d4615f63029a4f83828c47fb5c32bea8fb35214a"} Dec 04 12:40:10 crc kubenswrapper[4760]: I1204 12:40:10.039762 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.455334714 podStartE2EDuration="10.039730964s" podCreationTimestamp="2025-12-04 12:40:00 +0000 UTC" firstStartedPulling="2025-12-04 12:40:02.458752129 +0000 UTC m=+1605.500198696" lastFinishedPulling="2025-12-04 12:40:09.043148379 +0000 UTC m=+1612.084594946" observedRunningTime="2025-12-04 12:40:10.02291775 +0000 UTC m=+1613.064364317" watchObservedRunningTime="2025-12-04 12:40:10.039730964 +0000 UTC m=+1613.081177551" Dec 04 12:40:10 crc kubenswrapper[4760]: I1204 12:40:10.048849 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d","Type":"ContainerStarted","Data":"50eeee381aaa92364b3d0d09c55e70284abc89a1b67d305bc306a63f2b1ac3d8"} Dec 04 12:40:10 crc kubenswrapper[4760]: I1204 12:40:10.048917 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://50eeee381aaa92364b3d0d09c55e70284abc89a1b67d305bc306a63f2b1ac3d8" gracePeriod=30 Dec 04 12:40:10 crc kubenswrapper[4760]: I1204 12:40:10.067056 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99d6f493-b6f0-4340-9549-3c2c63e3c823","Type":"ContainerStarted","Data":"76bb5374363a7257d8ee2ee7910b6702f58eeb681602aaed8d8238f81623389c"} Dec 04 12:40:10 crc kubenswrapper[4760]: I1204 12:40:10.070187 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-9fm5q" podStartSLOduration=3.294216444 podStartE2EDuration="12.07015349s" podCreationTimestamp="2025-12-04 12:39:58 +0000 UTC" firstStartedPulling="2025-12-04 12:40:00.311490489 +0000 UTC m=+1603.352937056" lastFinishedPulling="2025-12-04 12:40:09.087427525 +0000 UTC m=+1612.128874102" observedRunningTime="2025-12-04 12:40:10.044120464 +0000 UTC m=+1613.085567031" watchObservedRunningTime="2025-12-04 12:40:10.07015349 +0000 UTC m=+1613.111600057" Dec 04 12:40:10 crc kubenswrapper[4760]: I1204 12:40:10.081565 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.584272919 podStartE2EDuration="10.081536062s" podCreationTimestamp="2025-12-04 12:40:00 +0000 UTC" firstStartedPulling="2025-12-04 12:40:02.580317539 +0000 UTC m=+1605.621764106" lastFinishedPulling="2025-12-04 12:40:09.077580682 +0000 UTC m=+1612.119027249" observedRunningTime="2025-12-04 12:40:10.067153095 +0000 UTC m=+1613.108599662" watchObservedRunningTime="2025-12-04 12:40:10.081536062 +0000 UTC m=+1613.122982629" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.096394 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46ccd08-f1b0-4c5a-afde-744a5286c129","Type":"ContainerStarted","Data":"d7926e426aeb2b6b236d18547df8553f5f67c594d92e66433ce5b6ceeb86e064"} Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.100888 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3564a3c-2250-4ac6-b382-15adcf042f3f","Type":"ContainerStarted","Data":"9dfcdad4e4005771f32503159800c22803626635ebb640103bee254d3ff23e62"} Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.101177 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a3564a3c-2250-4ac6-b382-15adcf042f3f" containerName="nova-metadata-log" 
containerID="cri-o://a792157cf013bc5dc2808482d4615f63029a4f83828c47fb5c32bea8fb35214a" gracePeriod=30 Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.101292 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a3564a3c-2250-4ac6-b382-15adcf042f3f" containerName="nova-metadata-metadata" containerID="cri-o://9dfcdad4e4005771f32503159800c22803626635ebb640103bee254d3ff23e62" gracePeriod=30 Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.105030 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99d6f493-b6f0-4340-9549-3c2c63e3c823","Type":"ContainerStarted","Data":"792d70a6efb21a6ffb1b1f449222869028cde729d7e520d51b6d72bc5c62717c"} Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.137134 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.137517 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.194531 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.458853869 podStartE2EDuration="11.19450039s" podCreationTimestamp="2025-12-04 12:40:00 +0000 UTC" firstStartedPulling="2025-12-04 12:40:02.307535639 +0000 UTC m=+1605.348982206" lastFinishedPulling="2025-12-04 12:40:09.04318216 +0000 UTC m=+1612.084628727" observedRunningTime="2025-12-04 12:40:11.12738739 +0000 UTC m=+1614.168833957" watchObservedRunningTime="2025-12-04 12:40:11.19450039 +0000 UTC m=+1614.235946957" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.213529 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.075602947 podStartE2EDuration="10.213489423s" podCreationTimestamp="2025-12-04 12:40:01 +0000 UTC" 
firstStartedPulling="2025-12-04 12:40:02.949506088 +0000 UTC m=+1605.990952655" lastFinishedPulling="2025-12-04 12:40:09.087392564 +0000 UTC m=+1612.128839131" observedRunningTime="2025-12-04 12:40:11.154411177 +0000 UTC m=+1614.195857744" watchObservedRunningTime="2025-12-04 12:40:11.213489423 +0000 UTC m=+1614.254935990" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.351374 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.406783 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.406894 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.485686 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.851817 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.852203 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 12:40:11 crc kubenswrapper[4760]: I1204 12:40:11.983470 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:12 crc kubenswrapper[4760]: I1204 12:40:12.077847 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56696ff475-64jjh"] Dec 04 12:40:12 crc kubenswrapper[4760]: I1204 12:40:12.078140 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56696ff475-64jjh" podUID="120ed25b-47b0-4306-974b-a66255cac4ce" containerName="dnsmasq-dns" 
containerID="cri-o://a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d" gracePeriod=10 Dec 04 12:40:12 crc kubenswrapper[4760]: I1204 12:40:12.183669 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99d6f493-b6f0-4340-9549-3c2c63e3c823","Type":"ContainerStarted","Data":"7aa7fc61ae5d0d8ac02910863a31254467a19710734535921c179e015468a44f"} Dec 04 12:40:12 crc kubenswrapper[4760]: I1204 12:40:12.191611 4760 generic.go:334] "Generic (PLEG): container finished" podID="a3564a3c-2250-4ac6-b382-15adcf042f3f" containerID="9dfcdad4e4005771f32503159800c22803626635ebb640103bee254d3ff23e62" exitCode=0 Dec 04 12:40:12 crc kubenswrapper[4760]: I1204 12:40:12.191659 4760 generic.go:334] "Generic (PLEG): container finished" podID="a3564a3c-2250-4ac6-b382-15adcf042f3f" containerID="a792157cf013bc5dc2808482d4615f63029a4f83828c47fb5c32bea8fb35214a" exitCode=143 Dec 04 12:40:12 crc kubenswrapper[4760]: I1204 12:40:12.193021 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3564a3c-2250-4ac6-b382-15adcf042f3f","Type":"ContainerDied","Data":"9dfcdad4e4005771f32503159800c22803626635ebb640103bee254d3ff23e62"} Dec 04 12:40:12 crc kubenswrapper[4760]: I1204 12:40:12.193070 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3564a3c-2250-4ac6-b382-15adcf042f3f","Type":"ContainerDied","Data":"a792157cf013bc5dc2808482d4615f63029a4f83828c47fb5c32bea8fb35214a"} Dec 04 12:40:12 crc kubenswrapper[4760]: I1204 12:40:12.224650 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:40:12 crc kubenswrapper[4760]: I1204 12:40:12.226017 4760 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:40:12 crc kubenswrapper[4760]: I1204 12:40:12.246778 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.143158 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.161531 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.225704 4760 generic.go:334] "Generic (PLEG): container finished" podID="120ed25b-47b0-4306-974b-a66255cac4ce" containerID="a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d" exitCode=0 Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.225894 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56696ff475-64jjh" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.227653 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56696ff475-64jjh" event={"ID":"120ed25b-47b0-4306-974b-a66255cac4ce","Type":"ContainerDied","Data":"a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d"} Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.227689 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56696ff475-64jjh" event={"ID":"120ed25b-47b0-4306-974b-a66255cac4ce","Type":"ContainerDied","Data":"9bfd0624544275c4a27f6e882d4a6d72319abe52ef59029ba612e3dc1e1b1590"} Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.227714 4760 scope.go:117] "RemoveContainer" containerID="a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.243435 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.243532 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3564a3c-2250-4ac6-b382-15adcf042f3f","Type":"ContainerDied","Data":"8cc14b1cc26fd973ca5b90befbebf6b990f490c0c8eab566ebcfc20683f7c5ab"} Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.284938 4760 scope.go:117] "RemoveContainer" containerID="39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.299303 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-swift-storage-0\") pod \"120ed25b-47b0-4306-974b-a66255cac4ce\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.299532 4760 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-svc\") pod \"120ed25b-47b0-4306-974b-a66255cac4ce\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.299569 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-combined-ca-bundle\") pod \"a3564a3c-2250-4ac6-b382-15adcf042f3f\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.299634 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-config\") pod \"120ed25b-47b0-4306-974b-a66255cac4ce\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.299689 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-nb\") pod \"120ed25b-47b0-4306-974b-a66255cac4ce\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.299727 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztkt7\" (UniqueName: \"kubernetes.io/projected/120ed25b-47b0-4306-974b-a66255cac4ce-kube-api-access-ztkt7\") pod \"120ed25b-47b0-4306-974b-a66255cac4ce\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.299853 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj6tk\" (UniqueName: \"kubernetes.io/projected/a3564a3c-2250-4ac6-b382-15adcf042f3f-kube-api-access-hj6tk\") pod \"a3564a3c-2250-4ac6-b382-15adcf042f3f\" (UID: 
\"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.299930 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-config-data\") pod \"a3564a3c-2250-4ac6-b382-15adcf042f3f\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.300094 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3564a3c-2250-4ac6-b382-15adcf042f3f-logs\") pod \"a3564a3c-2250-4ac6-b382-15adcf042f3f\" (UID: \"a3564a3c-2250-4ac6-b382-15adcf042f3f\") " Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.300190 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-sb\") pod \"120ed25b-47b0-4306-974b-a66255cac4ce\" (UID: \"120ed25b-47b0-4306-974b-a66255cac4ce\") " Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.315831 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3564a3c-2250-4ac6-b382-15adcf042f3f-logs" (OuterVolumeSpecName: "logs") pod "a3564a3c-2250-4ac6-b382-15adcf042f3f" (UID: "a3564a3c-2250-4ac6-b382-15adcf042f3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.316093 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3564a3c-2250-4ac6-b382-15adcf042f3f-kube-api-access-hj6tk" (OuterVolumeSpecName: "kube-api-access-hj6tk") pod "a3564a3c-2250-4ac6-b382-15adcf042f3f" (UID: "a3564a3c-2250-4ac6-b382-15adcf042f3f"). InnerVolumeSpecName "kube-api-access-hj6tk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.325224 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/120ed25b-47b0-4306-974b-a66255cac4ce-kube-api-access-ztkt7" (OuterVolumeSpecName: "kube-api-access-ztkt7") pod "120ed25b-47b0-4306-974b-a66255cac4ce" (UID: "120ed25b-47b0-4306-974b-a66255cac4ce"). InnerVolumeSpecName "kube-api-access-ztkt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.390403 4760 scope.go:117] "RemoveContainer" containerID="a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d" Dec 04 12:40:13 crc kubenswrapper[4760]: E1204 12:40:13.398770 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d\": container with ID starting with a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d not found: ID does not exist" containerID="a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.398844 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d"} err="failed to get container status \"a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d\": rpc error: code = NotFound desc = could not find container \"a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d\": container with ID starting with a5681be962ff6a5048d1fa4035b9cdca3f823dba5dcd9212fe08766b307a826d not found: ID does not exist" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.398887 4760 scope.go:117] "RemoveContainer" containerID="39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134" Dec 04 12:40:13 crc kubenswrapper[4760]: E1204 12:40:13.399703 
4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134\": container with ID starting with 39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134 not found: ID does not exist" containerID="39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.399731 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134"} err="failed to get container status \"39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134\": rpc error: code = NotFound desc = could not find container \"39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134\": container with ID starting with 39bac2e10a92252d22e762328aa04cd0deaba3feaac81e1f2c3624fd1a86b134 not found: ID does not exist" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.399747 4760 scope.go:117] "RemoveContainer" containerID="9dfcdad4e4005771f32503159800c22803626635ebb640103bee254d3ff23e62" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.403744 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3564a3c-2250-4ac6-b382-15adcf042f3f-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.403772 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztkt7\" (UniqueName: \"kubernetes.io/projected/120ed25b-47b0-4306-974b-a66255cac4ce-kube-api-access-ztkt7\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.403784 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj6tk\" (UniqueName: \"kubernetes.io/projected/a3564a3c-2250-4ac6-b382-15adcf042f3f-kube-api-access-hj6tk\") on node \"crc\" DevicePath \"\"" Dec 04 
12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.471459 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-config-data" (OuterVolumeSpecName: "config-data") pod "a3564a3c-2250-4ac6-b382-15adcf042f3f" (UID: "a3564a3c-2250-4ac6-b382-15adcf042f3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.477420 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3564a3c-2250-4ac6-b382-15adcf042f3f" (UID: "a3564a3c-2250-4ac6-b382-15adcf042f3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.508003 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "120ed25b-47b0-4306-974b-a66255cac4ce" (UID: "120ed25b-47b0-4306-974b-a66255cac4ce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.532682 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-config" (OuterVolumeSpecName: "config") pod "120ed25b-47b0-4306-974b-a66255cac4ce" (UID: "120ed25b-47b0-4306-974b-a66255cac4ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.538508 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.538555 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.538570 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3564a3c-2250-4ac6-b382-15adcf042f3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.538580 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.548127 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "120ed25b-47b0-4306-974b-a66255cac4ce" (UID: "120ed25b-47b0-4306-974b-a66255cac4ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.567459 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "120ed25b-47b0-4306-974b-a66255cac4ce" (UID: "120ed25b-47b0-4306-974b-a66255cac4ce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.569067 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "120ed25b-47b0-4306-974b-a66255cac4ce" (UID: "120ed25b-47b0-4306-974b-a66255cac4ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.643388 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.643443 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.643457 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/120ed25b-47b0-4306-974b-a66255cac4ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.669460 4760 scope.go:117] "RemoveContainer" containerID="a792157cf013bc5dc2808482d4615f63029a4f83828c47fb5c32bea8fb35214a" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.710489 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.735197 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.754876 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:13 crc kubenswrapper[4760]: E1204 12:40:13.755801 4760 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3564a3c-2250-4ac6-b382-15adcf042f3f" containerName="nova-metadata-log" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.755836 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3564a3c-2250-4ac6-b382-15adcf042f3f" containerName="nova-metadata-log" Dec 04 12:40:13 crc kubenswrapper[4760]: E1204 12:40:13.755857 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3564a3c-2250-4ac6-b382-15adcf042f3f" containerName="nova-metadata-metadata" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.755865 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3564a3c-2250-4ac6-b382-15adcf042f3f" containerName="nova-metadata-metadata" Dec 04 12:40:13 crc kubenswrapper[4760]: E1204 12:40:13.755889 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120ed25b-47b0-4306-974b-a66255cac4ce" containerName="init" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.755896 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="120ed25b-47b0-4306-974b-a66255cac4ce" containerName="init" Dec 04 12:40:13 crc kubenswrapper[4760]: E1204 12:40:13.755912 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="120ed25b-47b0-4306-974b-a66255cac4ce" containerName="dnsmasq-dns" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.755921 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="120ed25b-47b0-4306-974b-a66255cac4ce" containerName="dnsmasq-dns" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.756238 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3564a3c-2250-4ac6-b382-15adcf042f3f" containerName="nova-metadata-metadata" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.756265 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3564a3c-2250-4ac6-b382-15adcf042f3f" containerName="nova-metadata-log" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.756299 4760 
memory_manager.go:354] "RemoveStaleState removing state" podUID="120ed25b-47b0-4306-974b-a66255cac4ce" containerName="dnsmasq-dns" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.757740 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.762963 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.764455 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.778553 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.850477 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rdml\" (UniqueName: \"kubernetes.io/projected/db98f83c-86a8-4a9d-8fec-d608619a764d-kube-api-access-9rdml\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.852477 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.852660 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc 
kubenswrapper[4760]: I1204 12:40:13.852824 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-config-data\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.852982 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db98f83c-86a8-4a9d-8fec-d608619a764d-logs\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.897911 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3564a3c-2250-4ac6-b382-15adcf042f3f" path="/var/lib/kubelet/pods/a3564a3c-2250-4ac6-b382-15adcf042f3f/volumes" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.916939 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66f8fb5648-87dff" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.917104 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.941337 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56696ff475-64jjh"] Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.955410 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.955470 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.955510 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-config-data\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.955541 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db98f83c-86a8-4a9d-8fec-d608619a764d-logs\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.955610 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rdml\" (UniqueName: \"kubernetes.io/projected/db98f83c-86a8-4a9d-8fec-d608619a764d-kube-api-access-9rdml\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.960987 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db98f83c-86a8-4a9d-8fec-d608619a764d-logs\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.961280 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-56696ff475-64jjh"] Dec 04 12:40:13 crc kubenswrapper[4760]: I1204 12:40:13.964087 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-config-data\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:14 crc kubenswrapper[4760]: I1204 12:40:14.094969 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:14 crc kubenswrapper[4760]: I1204 12:40:14.098239 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:14 crc kubenswrapper[4760]: I1204 12:40:14.104126 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rdml\" (UniqueName: \"kubernetes.io/projected/db98f83c-86a8-4a9d-8fec-d608619a764d-kube-api-access-9rdml\") pod \"nova-metadata-0\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " pod="openstack/nova-metadata-0" Dec 04 12:40:14 crc kubenswrapper[4760]: I1204 12:40:14.257241 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99d6f493-b6f0-4340-9549-3c2c63e3c823","Type":"ContainerStarted","Data":"d054fcb57225874f17f8657980e47b6d9fbfe5c045990f24232ac1a2c61e1272"} Dec 04 12:40:14 crc kubenswrapper[4760]: I1204 12:40:14.259186 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 12:40:14 crc kubenswrapper[4760]: I1204 
12:40:14.302337 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.524954698 podStartE2EDuration="9.302306981s" podCreationTimestamp="2025-12-04 12:40:05 +0000 UTC" firstStartedPulling="2025-12-04 12:40:06.553751069 +0000 UTC m=+1609.595197636" lastFinishedPulling="2025-12-04 12:40:13.331103352 +0000 UTC m=+1616.372549919" observedRunningTime="2025-12-04 12:40:14.284486765 +0000 UTC m=+1617.325933352" watchObservedRunningTime="2025-12-04 12:40:14.302306981 +0000 UTC m=+1617.343753558" Dec 04 12:40:14 crc kubenswrapper[4760]: I1204 12:40:14.380933 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:40:15 crc kubenswrapper[4760]: I1204 12:40:15.320569 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:15 crc kubenswrapper[4760]: I1204 12:40:15.911361 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="120ed25b-47b0-4306-974b-a66255cac4ce" path="/var/lib/kubelet/pods/120ed25b-47b0-4306-974b-a66255cac4ce/volumes" Dec 04 12:40:16 crc kubenswrapper[4760]: E1204 12:40:16.244545 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b979c43_e46a_4104_a17a_fb065693bbbc.slice/crio-conmon-1aadefe6022699d485dbabbf02459c00a1eb7dc7dbf3fc530898205522cdbaea.scope\": RecentStats: unable to find data in memory cache]" Dec 04 12:40:16 crc kubenswrapper[4760]: I1204 12:40:16.289852 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db98f83c-86a8-4a9d-8fec-d608619a764d","Type":"ContainerStarted","Data":"2acd12546b218fb8024d72e9b2a610182bc995daba258bb47ca8529b7a2a0c7d"} Dec 04 12:40:16 crc kubenswrapper[4760]: I1204 12:40:16.290187 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"db98f83c-86a8-4a9d-8fec-d608619a764d","Type":"ContainerStarted","Data":"b282c7d886aa12a2f65f842d673a83ecc6290961fa7d9484a76d57802443cdd7"} Dec 04 12:40:16 crc kubenswrapper[4760]: I1204 12:40:16.290202 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db98f83c-86a8-4a9d-8fec-d608619a764d","Type":"ContainerStarted","Data":"3ed1703937e9bc423452958517222d2e932391067712b15d3126889cc7f00a79"} Dec 04 12:40:16 crc kubenswrapper[4760]: I1204 12:40:16.300930 4760 generic.go:334] "Generic (PLEG): container finished" podID="1b979c43-e46a-4104-a17a-fb065693bbbc" containerID="1aadefe6022699d485dbabbf02459c00a1eb7dc7dbf3fc530898205522cdbaea" exitCode=0 Dec 04 12:40:16 crc kubenswrapper[4760]: I1204 12:40:16.301511 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mtwzp" event={"ID":"1b979c43-e46a-4104-a17a-fb065693bbbc","Type":"ContainerDied","Data":"1aadefe6022699d485dbabbf02459c00a1eb7dc7dbf3fc530898205522cdbaea"} Dec 04 12:40:16 crc kubenswrapper[4760]: I1204 12:40:16.342579 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.342538594 podStartE2EDuration="3.342538594s" podCreationTimestamp="2025-12-04 12:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:16.3338951 +0000 UTC m=+1619.375341667" watchObservedRunningTime="2025-12-04 12:40:16.342538594 +0000 UTC m=+1619.383985161" Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.798665 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.876530 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-scripts\") pod \"1b979c43-e46a-4104-a17a-fb065693bbbc\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.876705 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-combined-ca-bundle\") pod \"1b979c43-e46a-4104-a17a-fb065693bbbc\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.876755 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvj64\" (UniqueName: \"kubernetes.io/projected/1b979c43-e46a-4104-a17a-fb065693bbbc-kube-api-access-hvj64\") pod \"1b979c43-e46a-4104-a17a-fb065693bbbc\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.876854 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-config-data\") pod \"1b979c43-e46a-4104-a17a-fb065693bbbc\" (UID: \"1b979c43-e46a-4104-a17a-fb065693bbbc\") " Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.891409 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-scripts" (OuterVolumeSpecName: "scripts") pod "1b979c43-e46a-4104-a17a-fb065693bbbc" (UID: "1b979c43-e46a-4104-a17a-fb065693bbbc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.897136 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b979c43-e46a-4104-a17a-fb065693bbbc-kube-api-access-hvj64" (OuterVolumeSpecName: "kube-api-access-hvj64") pod "1b979c43-e46a-4104-a17a-fb065693bbbc" (UID: "1b979c43-e46a-4104-a17a-fb065693bbbc"). InnerVolumeSpecName "kube-api-access-hvj64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.919271 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b979c43-e46a-4104-a17a-fb065693bbbc" (UID: "1b979c43-e46a-4104-a17a-fb065693bbbc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.935192 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-config-data" (OuterVolumeSpecName: "config-data") pod "1b979c43-e46a-4104-a17a-fb065693bbbc" (UID: "1b979c43-e46a-4104-a17a-fb065693bbbc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.980160 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.980234 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.980252 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvj64\" (UniqueName: \"kubernetes.io/projected/1b979c43-e46a-4104-a17a-fb065693bbbc-kube-api-access-hvj64\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:17 crc kubenswrapper[4760]: I1204 12:40:17.980268 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b979c43-e46a-4104-a17a-fb065693bbbc-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.323922 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mtwzp" event={"ID":"1b979c43-e46a-4104-a17a-fb065693bbbc","Type":"ContainerDied","Data":"7a2114d2834ec6fe3bd39fb3e1f184329858b0f7c9467c136be3c39f256382e4"} Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.323976 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a2114d2834ec6fe3bd39fb3e1f184329858b0f7c9467c136be3c39f256382e4" Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.324056 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mtwzp" Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.577783 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.578404 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerName="nova-api-log" containerID="cri-o://0d8d27dc22250fb37e78784e039221b4a75ca250f07552ba61ff5d08b405b602" gracePeriod=30 Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.578589 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerName="nova-api-api" containerID="cri-o://d7926e426aeb2b6b236d18547df8553f5f67c594d92e66433ce5b6ceeb86e064" gracePeriod=30 Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.626710 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.627103 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="40fb9f3e-4d85-4625-b137-5b349cff347a" containerName="nova-scheduler-scheduler" containerID="cri-o://234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217" gracePeriod=30 Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.641487 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.641927 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="db98f83c-86a8-4a9d-8fec-d608619a764d" containerName="nova-metadata-log" containerID="cri-o://b282c7d886aa12a2f65f842d673a83ecc6290961fa7d9484a76d57802443cdd7" gracePeriod=30 Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.642085 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="db98f83c-86a8-4a9d-8fec-d608619a764d" containerName="nova-metadata-metadata" containerID="cri-o://2acd12546b218fb8024d72e9b2a610182bc995daba258bb47ca8529b7a2a0c7d" gracePeriod=30 Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.902620 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:40:18 crc kubenswrapper[4760]: I1204 12:40:18.904934 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.360077 4760 generic.go:334] "Generic (PLEG): container finished" podID="db98f83c-86a8-4a9d-8fec-d608619a764d" containerID="2acd12546b218fb8024d72e9b2a610182bc995daba258bb47ca8529b7a2a0c7d" exitCode=0 Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.360405 4760 generic.go:334] "Generic (PLEG): container finished" podID="db98f83c-86a8-4a9d-8fec-d608619a764d" containerID="b282c7d886aa12a2f65f842d673a83ecc6290961fa7d9484a76d57802443cdd7" exitCode=143 Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.360146 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db98f83c-86a8-4a9d-8fec-d608619a764d","Type":"ContainerDied","Data":"2acd12546b218fb8024d72e9b2a610182bc995daba258bb47ca8529b7a2a0c7d"} Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.360573 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db98f83c-86a8-4a9d-8fec-d608619a764d","Type":"ContainerDied","Data":"b282c7d886aa12a2f65f842d673a83ecc6290961fa7d9484a76d57802443cdd7"} Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.372192 4760 generic.go:334] "Generic (PLEG): container finished" podID="c46ccd08-f1b0-4c5a-afde-744a5286c129" 
containerID="0d8d27dc22250fb37e78784e039221b4a75ca250f07552ba61ff5d08b405b602" exitCode=143 Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.372291 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46ccd08-f1b0-4c5a-afde-744a5286c129","Type":"ContainerDied","Data":"0d8d27dc22250fb37e78784e039221b4a75ca250f07552ba61ff5d08b405b602"} Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.389952 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.390033 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.800882 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.934407 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-nova-metadata-tls-certs\") pod \"db98f83c-86a8-4a9d-8fec-d608619a764d\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.934792 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-config-data\") pod \"db98f83c-86a8-4a9d-8fec-d608619a764d\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.935061 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-combined-ca-bundle\") pod \"db98f83c-86a8-4a9d-8fec-d608619a764d\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " Dec 04 12:40:19 crc 
kubenswrapper[4760]: I1204 12:40:19.935103 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rdml\" (UniqueName: \"kubernetes.io/projected/db98f83c-86a8-4a9d-8fec-d608619a764d-kube-api-access-9rdml\") pod \"db98f83c-86a8-4a9d-8fec-d608619a764d\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.935174 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db98f83c-86a8-4a9d-8fec-d608619a764d-logs\") pod \"db98f83c-86a8-4a9d-8fec-d608619a764d\" (UID: \"db98f83c-86a8-4a9d-8fec-d608619a764d\") " Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.936292 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db98f83c-86a8-4a9d-8fec-d608619a764d-logs" (OuterVolumeSpecName: "logs") pod "db98f83c-86a8-4a9d-8fec-d608619a764d" (UID: "db98f83c-86a8-4a9d-8fec-d608619a764d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.937022 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db98f83c-86a8-4a9d-8fec-d608619a764d-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.941730 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db98f83c-86a8-4a9d-8fec-d608619a764d-kube-api-access-9rdml" (OuterVolumeSpecName: "kube-api-access-9rdml") pod "db98f83c-86a8-4a9d-8fec-d608619a764d" (UID: "db98f83c-86a8-4a9d-8fec-d608619a764d"). InnerVolumeSpecName "kube-api-access-9rdml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.972518 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9fm5q" podUID="59247d0a-4277-4c1f-b351-057d2b553de6" containerName="registry-server" probeResult="failure" output=< Dec 04 12:40:19 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Dec 04 12:40:19 crc kubenswrapper[4760]: > Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.981580 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-config-data" (OuterVolumeSpecName: "config-data") pod "db98f83c-86a8-4a9d-8fec-d608619a764d" (UID: "db98f83c-86a8-4a9d-8fec-d608619a764d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:19 crc kubenswrapper[4760]: I1204 12:40:19.990578 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db98f83c-86a8-4a9d-8fec-d608619a764d" (UID: "db98f83c-86a8-4a9d-8fec-d608619a764d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.014782 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "db98f83c-86a8-4a9d-8fec-d608619a764d" (UID: "db98f83c-86a8-4a9d-8fec-d608619a764d"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.039783 4760 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.040069 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.040172 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db98f83c-86a8-4a9d-8fec-d608619a764d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.040286 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rdml\" (UniqueName: \"kubernetes.io/projected/db98f83c-86a8-4a9d-8fec-d608619a764d-kube-api-access-9rdml\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.109296 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.243370 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-config-data\") pod \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.243478 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-combined-ca-bundle\") pod \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.243630 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-secret-key\") pod \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.243706 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-scripts\") pod \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.244226 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8gbs\" (UniqueName: \"kubernetes.io/projected/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-kube-api-access-m8gbs\") pod \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.244259 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-tls-certs\") pod \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.244288 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-logs\") pod \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\" (UID: \"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc\") " Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.245407 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-logs" (OuterVolumeSpecName: "logs") pod "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" (UID: "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.252582 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-kube-api-access-m8gbs" (OuterVolumeSpecName: "kube-api-access-m8gbs") pod "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" (UID: "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc"). InnerVolumeSpecName "kube-api-access-m8gbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.253384 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" (UID: "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.279878 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-config-data" (OuterVolumeSpecName: "config-data") pod "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" (UID: "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.300014 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-scripts" (OuterVolumeSpecName: "scripts") pod "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" (UID: "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.316768 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" (UID: "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.321863 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" (UID: "a1b21d48-8d8c-4c52-8be9-7a188fffa3cc"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.347978 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8gbs\" (UniqueName: \"kubernetes.io/projected/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-kube-api-access-m8gbs\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.348331 4760 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.348346 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.348359 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.348370 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.348381 4760 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.348390 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.387075 4760 generic.go:334] 
"Generic (PLEG): container finished" podID="20b1fcf1-5d52-4713-bfa5-2856128d6df5" containerID="c897eb25114cb298991bc8c864cf2dbd15edd40eea2aa954f566c207bc2d2495" exitCode=0 Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.388019 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dqc6v" event={"ID":"20b1fcf1-5d52-4713-bfa5-2856128d6df5","Type":"ContainerDied","Data":"c897eb25114cb298991bc8c864cf2dbd15edd40eea2aa954f566c207bc2d2495"} Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.394152 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.394152 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"db98f83c-86a8-4a9d-8fec-d608619a764d","Type":"ContainerDied","Data":"3ed1703937e9bc423452958517222d2e932391067712b15d3126889cc7f00a79"} Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.394433 4760 scope.go:117] "RemoveContainer" containerID="2acd12546b218fb8024d72e9b2a610182bc995daba258bb47ca8529b7a2a0c7d" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.404621 4760 generic.go:334] "Generic (PLEG): container finished" podID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerID="80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec" exitCode=137 Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.404695 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f8fb5648-87dff" event={"ID":"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc","Type":"ContainerDied","Data":"80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec"} Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.404739 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66f8fb5648-87dff" 
event={"ID":"a1b21d48-8d8c-4c52-8be9-7a188fffa3cc","Type":"ContainerDied","Data":"be6dafefd520de142d8f1d0b12d74c215d160de63da908763acfe2c8e19d2785"} Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.404848 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66f8fb5648-87dff" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.430704 4760 scope.go:117] "RemoveContainer" containerID="b282c7d886aa12a2f65f842d673a83ecc6290961fa7d9484a76d57802443cdd7" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.464286 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.485737 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.492445 4760 scope.go:117] "RemoveContainer" containerID="7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.514563 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66f8fb5648-87dff"] Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.550476 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-66f8fb5648-87dff"] Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.580162 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:20 crc kubenswrapper[4760]: E1204 12:40:20.584556 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.584609 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" Dec 04 12:40:20 crc kubenswrapper[4760]: E1204 12:40:20.584627 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="db98f83c-86a8-4a9d-8fec-d608619a764d" containerName="nova-metadata-log" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.584635 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="db98f83c-86a8-4a9d-8fec-d608619a764d" containerName="nova-metadata-log" Dec 04 12:40:20 crc kubenswrapper[4760]: E1204 12:40:20.584650 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b979c43-e46a-4104-a17a-fb065693bbbc" containerName="nova-manage" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.584658 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b979c43-e46a-4104-a17a-fb065693bbbc" containerName="nova-manage" Dec 04 12:40:20 crc kubenswrapper[4760]: E1204 12:40:20.584699 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon-log" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.584708 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon-log" Dec 04 12:40:20 crc kubenswrapper[4760]: E1204 12:40:20.584749 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db98f83c-86a8-4a9d-8fec-d608619a764d" containerName="nova-metadata-metadata" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.584755 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="db98f83c-86a8-4a9d-8fec-d608619a764d" containerName="nova-metadata-metadata" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.585225 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.585240 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b979c43-e46a-4104-a17a-fb065693bbbc" containerName="nova-manage" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.585258 4760 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="db98f83c-86a8-4a9d-8fec-d608619a764d" containerName="nova-metadata-log" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.585273 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon-log" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.585285 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="db98f83c-86a8-4a9d-8fec-d608619a764d" containerName="nova-metadata-metadata" Dec 04 12:40:20 crc kubenswrapper[4760]: E1204 12:40:20.585508 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.585516 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.585749 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" containerName="horizon" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.586764 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.589945 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.590261 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.593907 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.732882 4760 scope.go:117] "RemoveContainer" containerID="80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.771165 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.771433 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.771624 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ce9eb2-8a0c-4215-86f1-4e012455caa1-logs\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.772127 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsk2g\" (UniqueName: \"kubernetes.io/projected/28ce9eb2-8a0c-4215-86f1-4e012455caa1-kube-api-access-nsk2g\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.772286 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-config-data\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.794279 4760 scope.go:117] "RemoveContainer" containerID="7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba" Dec 04 12:40:20 crc kubenswrapper[4760]: E1204 12:40:20.795304 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba\": container with ID starting with 7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba not found: ID does not exist" containerID="7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.795368 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba"} err="failed to get container status \"7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba\": rpc error: code = NotFound desc = could not find container \"7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba\": container with ID starting with 7d23daa3695c05e91f983816cc34112a74f19fac6a473b97d6b33e450a45e9ba not found: ID does not exist" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.795413 4760 scope.go:117] 
"RemoveContainer" containerID="80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec" Dec 04 12:40:20 crc kubenswrapper[4760]: E1204 12:40:20.796050 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec\": container with ID starting with 80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec not found: ID does not exist" containerID="80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.796141 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec"} err="failed to get container status \"80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec\": rpc error: code = NotFound desc = could not find container \"80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec\": container with ID starting with 80a7c1a16a12887d2bc7fc82c65eae85dd349b635cc800d8256aa64e0ca411ec not found: ID does not exist" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.874758 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsk2g\" (UniqueName: \"kubernetes.io/projected/28ce9eb2-8a0c-4215-86f1-4e012455caa1-kube-api-access-nsk2g\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.874845 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-config-data\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.874931 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.875024 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.875108 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ce9eb2-8a0c-4215-86f1-4e012455caa1-logs\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.875895 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ce9eb2-8a0c-4215-86f1-4e012455caa1-logs\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.881010 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.881904 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-config-data\") pod \"nova-metadata-0\" (UID: 
\"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.890076 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.899564 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsk2g\" (UniqueName: \"kubernetes.io/projected/28ce9eb2-8a0c-4215-86f1-4e012455caa1-kube-api-access-nsk2g\") pod \"nova-metadata-0\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " pod="openstack/nova-metadata-0" Dec 04 12:40:20 crc kubenswrapper[4760]: I1204 12:40:20.929124 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:40:21 crc kubenswrapper[4760]: E1204 12:40:21.407713 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217 is running failed: container process not found" containerID="234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 12:40:21 crc kubenswrapper[4760]: E1204 12:40:21.408779 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217 is running failed: container process not found" containerID="234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 12:40:21 crc kubenswrapper[4760]: E1204 12:40:21.410045 4760 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217 is running failed: container process not found" containerID="234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 12:40:21 crc kubenswrapper[4760]: E1204 12:40:21.410087 4760 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="40fb9f3e-4d85-4625-b137-5b349cff347a" containerName="nova-scheduler-scheduler" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.427533 4760 generic.go:334] "Generic (PLEG): container finished" podID="40fb9f3e-4d85-4625-b137-5b349cff347a" containerID="234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217" exitCode=0 Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.427624 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40fb9f3e-4d85-4625-b137-5b349cff347a","Type":"ContainerDied","Data":"234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217"} Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.514437 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.544097 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.714628 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-config-data\") pod \"40fb9f3e-4d85-4625-b137-5b349cff347a\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.714792 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-combined-ca-bundle\") pod \"40fb9f3e-4d85-4625-b137-5b349cff347a\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.714964 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd5qd\" (UniqueName: \"kubernetes.io/projected/40fb9f3e-4d85-4625-b137-5b349cff347a-kube-api-access-dd5qd\") pod \"40fb9f3e-4d85-4625-b137-5b349cff347a\" (UID: \"40fb9f3e-4d85-4625-b137-5b349cff347a\") " Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.722349 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40fb9f3e-4d85-4625-b137-5b349cff347a-kube-api-access-dd5qd" (OuterVolumeSpecName: "kube-api-access-dd5qd") pod "40fb9f3e-4d85-4625-b137-5b349cff347a" (UID: "40fb9f3e-4d85-4625-b137-5b349cff347a"). InnerVolumeSpecName "kube-api-access-dd5qd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.757255 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-config-data" (OuterVolumeSpecName: "config-data") pod "40fb9f3e-4d85-4625-b137-5b349cff347a" (UID: "40fb9f3e-4d85-4625-b137-5b349cff347a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.760920 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40fb9f3e-4d85-4625-b137-5b349cff347a" (UID: "40fb9f3e-4d85-4625-b137-5b349cff347a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.817655 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.817944 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fb9f3e-4d85-4625-b137-5b349cff347a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.817959 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd5qd\" (UniqueName: \"kubernetes.io/projected/40fb9f3e-4d85-4625-b137-5b349cff347a-kube-api-access-dd5qd\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.878671 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b21d48-8d8c-4c52-8be9-7a188fffa3cc" path="/var/lib/kubelet/pods/a1b21d48-8d8c-4c52-8be9-7a188fffa3cc/volumes" Dec 
04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.879787 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db98f83c-86a8-4a9d-8fec-d608619a764d" path="/var/lib/kubelet/pods/db98f83c-86a8-4a9d-8fec-d608619a764d/volumes" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.886573 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dqc6v" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.919080 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-scripts\") pod \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.919186 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-config-data\") pod \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.919450 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4jzs\" (UniqueName: \"kubernetes.io/projected/20b1fcf1-5d52-4713-bfa5-2856128d6df5-kube-api-access-j4jzs\") pod \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.919592 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-combined-ca-bundle\") pod \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\" (UID: \"20b1fcf1-5d52-4713-bfa5-2856128d6df5\") " Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.923486 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-scripts" (OuterVolumeSpecName: "scripts") pod "20b1fcf1-5d52-4713-bfa5-2856128d6df5" (UID: "20b1fcf1-5d52-4713-bfa5-2856128d6df5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.937428 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b1fcf1-5d52-4713-bfa5-2856128d6df5-kube-api-access-j4jzs" (OuterVolumeSpecName: "kube-api-access-j4jzs") pod "20b1fcf1-5d52-4713-bfa5-2856128d6df5" (UID: "20b1fcf1-5d52-4713-bfa5-2856128d6df5"). InnerVolumeSpecName "kube-api-access-j4jzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.958324 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20b1fcf1-5d52-4713-bfa5-2856128d6df5" (UID: "20b1fcf1-5d52-4713-bfa5-2856128d6df5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:21 crc kubenswrapper[4760]: I1204 12:40:21.966037 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-config-data" (OuterVolumeSpecName: "config-data") pod "20b1fcf1-5d52-4713-bfa5-2856128d6df5" (UID: "20b1fcf1-5d52-4713-bfa5-2856128d6df5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.021709 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.021744 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.021755 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b1fcf1-5d52-4713-bfa5-2856128d6df5-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.021766 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4jzs\" (UniqueName: \"kubernetes.io/projected/20b1fcf1-5d52-4713-bfa5-2856128d6df5-kube-api-access-j4jzs\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.445649 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dqc6v" event={"ID":"20b1fcf1-5d52-4713-bfa5-2856128d6df5","Type":"ContainerDied","Data":"11b57215c4038a5e5154e0aab357dfd0a47973cd6c77102069774c1bb55d6641"} Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.445718 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11b57215c4038a5e5154e0aab357dfd0a47973cd6c77102069774c1bb55d6641" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.446182 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dqc6v" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.449321 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40fb9f3e-4d85-4625-b137-5b349cff347a","Type":"ContainerDied","Data":"bf9d6c3b7a3b7df38403e3ed93b50ae709495c5fc05aad5b73ceaaabe6210fa4"} Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.449386 4760 scope.go:117] "RemoveContainer" containerID="234899fd828f9b03aad97ede6d69889d601da6f1e7ccce991771446f8f761217" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.449386 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.454965 4760 generic.go:334] "Generic (PLEG): container finished" podID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerID="d7926e426aeb2b6b236d18547df8553f5f67c594d92e66433ce5b6ceeb86e064" exitCode=0 Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.455039 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46ccd08-f1b0-4c5a-afde-744a5286c129","Type":"ContainerDied","Data":"d7926e426aeb2b6b236d18547df8553f5f67c594d92e66433ce5b6ceeb86e064"} Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.467557 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28ce9eb2-8a0c-4215-86f1-4e012455caa1","Type":"ContainerStarted","Data":"07050799506c33f4231ca15ac581a760732b8048f95c8abf59f71989beccfa9d"} Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.467616 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28ce9eb2-8a0c-4215-86f1-4e012455caa1","Type":"ContainerStarted","Data":"c1f5fbc249d6788a027f19a2e72d504142248535ced221cafc0e26f99b92e1df"} Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.494637 4760 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.524309 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.545465 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:40:22 crc kubenswrapper[4760]: E1204 12:40:22.547000 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40fb9f3e-4d85-4625-b137-5b349cff347a" containerName="nova-scheduler-scheduler" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.547108 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="40fb9f3e-4d85-4625-b137-5b349cff347a" containerName="nova-scheduler-scheduler" Dec 04 12:40:22 crc kubenswrapper[4760]: E1204 12:40:22.547261 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b1fcf1-5d52-4713-bfa5-2856128d6df5" containerName="nova-cell1-conductor-db-sync" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.547344 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b1fcf1-5d52-4713-bfa5-2856128d6df5" containerName="nova-cell1-conductor-db-sync" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.547785 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="40fb9f3e-4d85-4625-b137-5b349cff347a" containerName="nova-scheduler-scheduler" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.548436 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b1fcf1-5d52-4713-bfa5-2856128d6df5" containerName="nova-cell1-conductor-db-sync" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.549979 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.555448 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.558565 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.558880 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.566253 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.580176 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.603662 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.660733 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e8ed881-2d3f-4939-b065-5b6860ad523d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4e8ed881-2d3f-4939-b065-5b6860ad523d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.660812 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-config-data\") pod \"nova-scheduler-0\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.660851 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w2j2m\" (UniqueName: \"kubernetes.io/projected/4811fb25-573c-4a84-82ed-9bc01eb00341-kube-api-access-w2j2m\") pod \"nova-scheduler-0\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.660904 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r48vl\" (UniqueName: \"kubernetes.io/projected/4e8ed881-2d3f-4939-b065-5b6860ad523d-kube-api-access-r48vl\") pod \"nova-cell1-conductor-0\" (UID: \"4e8ed881-2d3f-4939-b065-5b6860ad523d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.661061 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e8ed881-2d3f-4939-b065-5b6860ad523d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4e8ed881-2d3f-4939-b065-5b6860ad523d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.661169 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.765370 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r48vl\" (UniqueName: \"kubernetes.io/projected/4e8ed881-2d3f-4939-b065-5b6860ad523d-kube-api-access-r48vl\") pod \"nova-cell1-conductor-0\" (UID: \"4e8ed881-2d3f-4939-b065-5b6860ad523d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.765608 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4e8ed881-2d3f-4939-b065-5b6860ad523d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4e8ed881-2d3f-4939-b065-5b6860ad523d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.765730 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.765836 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e8ed881-2d3f-4939-b065-5b6860ad523d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4e8ed881-2d3f-4939-b065-5b6860ad523d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.765871 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-config-data\") pod \"nova-scheduler-0\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.765903 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2j2m\" (UniqueName: \"kubernetes.io/projected/4811fb25-573c-4a84-82ed-9bc01eb00341-kube-api-access-w2j2m\") pod \"nova-scheduler-0\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.776591 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e8ed881-2d3f-4939-b065-5b6860ad523d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"4e8ed881-2d3f-4939-b065-5b6860ad523d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.785904 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-config-data\") pod \"nova-scheduler-0\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.790272 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e8ed881-2d3f-4939-b065-5b6860ad523d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4e8ed881-2d3f-4939-b065-5b6860ad523d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.794066 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.796989 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r48vl\" (UniqueName: \"kubernetes.io/projected/4e8ed881-2d3f-4939-b065-5b6860ad523d-kube-api-access-r48vl\") pod \"nova-cell1-conductor-0\" (UID: \"4e8ed881-2d3f-4939-b065-5b6860ad523d\") " pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.804795 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2j2m\" (UniqueName: \"kubernetes.io/projected/4811fb25-573c-4a84-82ed-9bc01eb00341-kube-api-access-w2j2m\") pod \"nova-scheduler-0\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.899176 4760 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 12:40:22 crc kubenswrapper[4760]: I1204 12:40:22.921431 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.079673 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.176435 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-combined-ca-bundle\") pod \"c46ccd08-f1b0-4c5a-afde-744a5286c129\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.176664 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46ccd08-f1b0-4c5a-afde-744a5286c129-logs\") pod \"c46ccd08-f1b0-4c5a-afde-744a5286c129\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.176741 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-config-data\") pod \"c46ccd08-f1b0-4c5a-afde-744a5286c129\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.176895 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89nlg\" (UniqueName: \"kubernetes.io/projected/c46ccd08-f1b0-4c5a-afde-744a5286c129-kube-api-access-89nlg\") pod \"c46ccd08-f1b0-4c5a-afde-744a5286c129\" (UID: \"c46ccd08-f1b0-4c5a-afde-744a5286c129\") " Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.180194 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c46ccd08-f1b0-4c5a-afde-744a5286c129-logs" (OuterVolumeSpecName: "logs") pod "c46ccd08-f1b0-4c5a-afde-744a5286c129" (UID: "c46ccd08-f1b0-4c5a-afde-744a5286c129"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.191168 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c46ccd08-f1b0-4c5a-afde-744a5286c129-kube-api-access-89nlg" (OuterVolumeSpecName: "kube-api-access-89nlg") pod "c46ccd08-f1b0-4c5a-afde-744a5286c129" (UID: "c46ccd08-f1b0-4c5a-afde-744a5286c129"). InnerVolumeSpecName "kube-api-access-89nlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.218402 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c46ccd08-f1b0-4c5a-afde-744a5286c129" (UID: "c46ccd08-f1b0-4c5a-afde-744a5286c129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.225617 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-config-data" (OuterVolumeSpecName: "config-data") pod "c46ccd08-f1b0-4c5a-afde-744a5286c129" (UID: "c46ccd08-f1b0-4c5a-afde-744a5286c129"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.283523 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89nlg\" (UniqueName: \"kubernetes.io/projected/c46ccd08-f1b0-4c5a-afde-744a5286c129-kube-api-access-89nlg\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.283590 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.283602 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c46ccd08-f1b0-4c5a-afde-744a5286c129-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.283612 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c46ccd08-f1b0-4c5a-afde-744a5286c129-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.351545 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.500360 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c46ccd08-f1b0-4c5a-afde-744a5286c129","Type":"ContainerDied","Data":"c140f453a7b7c7dcf49fb1ea8b4c24cd4b07181b2a13c865c073fb7f542b844d"} Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.500455 4760 scope.go:117] "RemoveContainer" containerID="d7926e426aeb2b6b236d18547df8553f5f67c594d92e66433ce5b6ceeb86e064" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.500637 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.513108 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28ce9eb2-8a0c-4215-86f1-4e012455caa1","Type":"ContainerStarted","Data":"fc09e452bc279ff66d05279144734f7d866414ef534e87853507ed99fb1648df"} Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.517427 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4811fb25-573c-4a84-82ed-9bc01eb00341","Type":"ContainerStarted","Data":"4c7bbb7499e64883c8699e1fc0e2cfd8f3d039caabbc8204afdbc0447b67a63a"} Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.557452 4760 scope.go:117] "RemoveContainer" containerID="0d8d27dc22250fb37e78784e039221b4a75ca250f07552ba61ff5d08b405b602" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.559090 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.559071649 podStartE2EDuration="3.559071649s" podCreationTimestamp="2025-12-04 12:40:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:23.534232081 +0000 UTC m=+1626.575678648" watchObservedRunningTime="2025-12-04 12:40:23.559071649 +0000 UTC m=+1626.600518216" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.582726 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.599047 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.642359 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:23 crc kubenswrapper[4760]: E1204 12:40:23.645126 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerName="nova-api-log" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.648527 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerName="nova-api-log" Dec 04 12:40:23 crc kubenswrapper[4760]: E1204 12:40:23.648658 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerName="nova-api-api" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.648666 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerName="nova-api-api" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.649172 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerName="nova-api-log" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.649226 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" containerName="nova-api-api" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.650739 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.708336 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.708848 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.727891 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.804506 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-config-data\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.804603 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.804691 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85936742-df3d-4545-a9c1-9b948f71f849-logs\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.804919 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgj5p\" (UniqueName: \"kubernetes.io/projected/85936742-df3d-4545-a9c1-9b948f71f849-kube-api-access-sgj5p\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" 
Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.879441 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40fb9f3e-4d85-4625-b137-5b349cff347a" path="/var/lib/kubelet/pods/40fb9f3e-4d85-4625-b137-5b349cff347a/volumes" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.880279 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46ccd08-f1b0-4c5a-afde-744a5286c129" path="/var/lib/kubelet/pods/c46ccd08-f1b0-4c5a-afde-744a5286c129/volumes" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.908537 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-config-data\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.908601 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.908669 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85936742-df3d-4545-a9c1-9b948f71f849-logs\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.908742 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgj5p\" (UniqueName: \"kubernetes.io/projected/85936742-df3d-4545-a9c1-9b948f71f849-kube-api-access-sgj5p\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.909719 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85936742-df3d-4545-a9c1-9b948f71f849-logs\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.913025 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-config-data\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.913640 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:23 crc kubenswrapper[4760]: I1204 12:40:23.929237 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgj5p\" (UniqueName: \"kubernetes.io/projected/85936742-df3d-4545-a9c1-9b948f71f849-kube-api-access-sgj5p\") pod \"nova-api-0\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " pod="openstack/nova-api-0" Dec 04 12:40:24 crc kubenswrapper[4760]: I1204 12:40:24.062151 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:40:24 crc kubenswrapper[4760]: I1204 12:40:24.535630 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4e8ed881-2d3f-4939-b065-5b6860ad523d","Type":"ContainerStarted","Data":"cead7d001223e51e1b6cffc85f48d324c51bff2dab9fa051a65228a5ea67cf4c"} Dec 04 12:40:24 crc kubenswrapper[4760]: I1204 12:40:24.535952 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4e8ed881-2d3f-4939-b065-5b6860ad523d","Type":"ContainerStarted","Data":"e64d5c871d6a9e28af5357d515c5f687fd5ba5a6f991859d3e05d2b3fe7d1b4d"} Dec 04 12:40:24 crc kubenswrapper[4760]: I1204 12:40:24.535977 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:24 crc kubenswrapper[4760]: I1204 12:40:24.537516 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4811fb25-573c-4a84-82ed-9bc01eb00341","Type":"ContainerStarted","Data":"fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2"} Dec 04 12:40:24 crc kubenswrapper[4760]: I1204 12:40:24.558843 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.558815384 podStartE2EDuration="2.558815384s" podCreationTimestamp="2025-12-04 12:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:24.553787055 +0000 UTC m=+1627.595233622" watchObservedRunningTime="2025-12-04 12:40:24.558815384 +0000 UTC m=+1627.600261951" Dec 04 12:40:24 crc kubenswrapper[4760]: I1204 12:40:24.576973 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.57694212 podStartE2EDuration="2.57694212s" podCreationTimestamp="2025-12-04 12:40:22 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:24.576801826 +0000 UTC m=+1627.618248393" watchObservedRunningTime="2025-12-04 12:40:24.57694212 +0000 UTC m=+1627.618388687" Dec 04 12:40:24 crc kubenswrapper[4760]: I1204 12:40:24.614008 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:25 crc kubenswrapper[4760]: I1204 12:40:25.566368 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85936742-df3d-4545-a9c1-9b948f71f849","Type":"ContainerStarted","Data":"676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17"} Dec 04 12:40:25 crc kubenswrapper[4760]: I1204 12:40:25.566964 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85936742-df3d-4545-a9c1-9b948f71f849","Type":"ContainerStarted","Data":"54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199"} Dec 04 12:40:25 crc kubenswrapper[4760]: I1204 12:40:25.566975 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85936742-df3d-4545-a9c1-9b948f71f849","Type":"ContainerStarted","Data":"8b70346da0f30308d0bfc36fce8c285077fee674fbe52d4f3c0f67de66952868"} Dec 04 12:40:25 crc kubenswrapper[4760]: I1204 12:40:25.600645 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.600618684 podStartE2EDuration="2.600618684s" podCreationTimestamp="2025-12-04 12:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:25.589278024 +0000 UTC m=+1628.630724611" watchObservedRunningTime="2025-12-04 12:40:25.600618684 +0000 UTC m=+1628.642065251" Dec 04 12:40:25 crc kubenswrapper[4760]: I1204 12:40:25.929519 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Dec 04 12:40:25 crc kubenswrapper[4760]: I1204 12:40:25.929599 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 12:40:27 crc kubenswrapper[4760]: I1204 12:40:27.900429 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 12:40:28 crc kubenswrapper[4760]: I1204 12:40:28.976196 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:40:29 crc kubenswrapper[4760]: I1204 12:40:29.035656 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:40:29 crc kubenswrapper[4760]: I1204 12:40:29.743278 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9fm5q"] Dec 04 12:40:30 crc kubenswrapper[4760]: I1204 12:40:30.657946 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9fm5q" podUID="59247d0a-4277-4c1f-b351-057d2b553de6" containerName="registry-server" containerID="cri-o://4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9" gracePeriod=2 Dec 04 12:40:30 crc kubenswrapper[4760]: I1204 12:40:30.936913 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 12:40:30 crc kubenswrapper[4760]: I1204 12:40:30.937006 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.222075 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.415521 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-utilities\") pod \"59247d0a-4277-4c1f-b351-057d2b553de6\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.415931 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-catalog-content\") pod \"59247d0a-4277-4c1f-b351-057d2b553de6\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.416039 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6v4l\" (UniqueName: \"kubernetes.io/projected/59247d0a-4277-4c1f-b351-057d2b553de6-kube-api-access-r6v4l\") pod \"59247d0a-4277-4c1f-b351-057d2b553de6\" (UID: \"59247d0a-4277-4c1f-b351-057d2b553de6\") " Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.416954 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-utilities" (OuterVolumeSpecName: "utilities") pod "59247d0a-4277-4c1f-b351-057d2b553de6" (UID: "59247d0a-4277-4c1f-b351-057d2b553de6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.438676 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59247d0a-4277-4c1f-b351-057d2b553de6-kube-api-access-r6v4l" (OuterVolumeSpecName: "kube-api-access-r6v4l") pod "59247d0a-4277-4c1f-b351-057d2b553de6" (UID: "59247d0a-4277-4c1f-b351-057d2b553de6"). InnerVolumeSpecName "kube-api-access-r6v4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.517942 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.517994 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6v4l\" (UniqueName: \"kubernetes.io/projected/59247d0a-4277-4c1f-b351-057d2b553de6-kube-api-access-r6v4l\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.572661 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59247d0a-4277-4c1f-b351-057d2b553de6" (UID: "59247d0a-4277-4c1f-b351-057d2b553de6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.619915 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59247d0a-4277-4c1f-b351-057d2b553de6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.669837 4760 generic.go:334] "Generic (PLEG): container finished" podID="59247d0a-4277-4c1f-b351-057d2b553de6" containerID="4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9" exitCode=0 Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.669911 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9fm5q" event={"ID":"59247d0a-4277-4c1f-b351-057d2b553de6","Type":"ContainerDied","Data":"4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9"} Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.669953 4760 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-9fm5q" event={"ID":"59247d0a-4277-4c1f-b351-057d2b553de6","Type":"ContainerDied","Data":"6c397af490ef2141dd5b4ca95c1e69d60c67283613e60d1284e2c1cccdce2717"} Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.669980 4760 scope.go:117] "RemoveContainer" containerID="4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.670234 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9fm5q" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.703127 4760 scope.go:117] "RemoveContainer" containerID="cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.719909 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9fm5q"] Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.730851 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9fm5q"] Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.737564 4760 scope.go:117] "RemoveContainer" containerID="b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.780621 4760 scope.go:117] "RemoveContainer" containerID="4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9" Dec 04 12:40:31 crc kubenswrapper[4760]: E1204 12:40:31.781874 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9\": container with ID starting with 4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9 not found: ID does not exist" containerID="4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.781920 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9"} err="failed to get container status \"4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9\": rpc error: code = NotFound desc = could not find container \"4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9\": container with ID starting with 4b855b1818c27419e6607eb66688c18e8b926b22c4c9f4289c6966a2e01768d9 not found: ID does not exist" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.781954 4760 scope.go:117] "RemoveContainer" containerID="cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72" Dec 04 12:40:31 crc kubenswrapper[4760]: E1204 12:40:31.782309 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72\": container with ID starting with cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72 not found: ID does not exist" containerID="cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.782340 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72"} err="failed to get container status \"cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72\": rpc error: code = NotFound desc = could not find container \"cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72\": container with ID starting with cd610a4c8e21180cfd4e2e0ba85821a07038662407e1f324b689d83d3e49da72 not found: ID does not exist" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.782362 4760 scope.go:117] "RemoveContainer" containerID="b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca" Dec 04 12:40:31 crc kubenswrapper[4760]: E1204 
12:40:31.784550 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca\": container with ID starting with b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca not found: ID does not exist" containerID="b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.784633 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca"} err="failed to get container status \"b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca\": rpc error: code = NotFound desc = could not find container \"b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca\": container with ID starting with b0025b0c9308a64be3f167d18aa80f2323f9d48a287864f43e6d0aaf5058d7ca not found: ID does not exist" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.877087 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59247d0a-4277-4c1f-b351-057d2b553de6" path="/var/lib/kubelet/pods/59247d0a-4277-4c1f-b351-057d2b553de6/volumes" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.951030 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:40:31 crc kubenswrapper[4760]: I1204 12:40:31.951049 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Dec 04 12:40:32 crc kubenswrapper[4760]: I1204 12:40:32.901165 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 12:40:32 crc kubenswrapper[4760]: I1204 12:40:32.945320 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 12:40:32 crc kubenswrapper[4760]: I1204 12:40:32.959724 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 12:40:33 crc kubenswrapper[4760]: I1204 12:40:33.381155 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:40:33 crc kubenswrapper[4760]: I1204 12:40:33.381324 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:40:33 crc kubenswrapper[4760]: I1204 12:40:33.381411 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:40:33 crc kubenswrapper[4760]: I1204 12:40:33.382593 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 12:40:33 crc kubenswrapper[4760]: I1204 12:40:33.382670 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" gracePeriod=600 Dec 04 12:40:33 crc kubenswrapper[4760]: I1204 12:40:33.707801 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" exitCode=0 Dec 04 12:40:33 crc kubenswrapper[4760]: I1204 12:40:33.707908 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a"} Dec 04 12:40:33 crc kubenswrapper[4760]: I1204 12:40:33.708095 4760 scope.go:117] "RemoveContainer" containerID="47735eb331db95a9c8463c133a692889dc631bd67fa11179c4ea953bd5406acf" Dec 04 12:40:33 crc kubenswrapper[4760]: I1204 12:40:33.747837 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 12:40:34 crc kubenswrapper[4760]: E1204 12:40:34.011611 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:40:34 crc kubenswrapper[4760]: I1204 12:40:34.063310 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 12:40:34 crc kubenswrapper[4760]: I1204 12:40:34.063410 4760 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 12:40:34 crc kubenswrapper[4760]: I1204 12:40:34.722720 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:40:34 crc kubenswrapper[4760]: E1204 12:40:34.723115 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:40:35 crc kubenswrapper[4760]: I1204 12:40:35.146782 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="85936742-df3d-4545-a9c1-9b948f71f849" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:40:35 crc kubenswrapper[4760]: I1204 12:40:35.146853 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="85936742-df3d-4545-a9c1-9b948f71f849" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:40:35 crc kubenswrapper[4760]: I1204 12:40:35.886835 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 12:40:40 crc kubenswrapper[4760]: I1204 12:40:40.491499 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 12:40:40 crc kubenswrapper[4760]: I1204 12:40:40.492634 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" 
podUID="d389a85e-f3e6-4ad5-b11d-9555d6aee3a8" containerName="kube-state-metrics" containerID="cri-o://61f3caafe7376c0e45c33ece0a9e92fd8c81d842dd36a3ecc67f258cd02a7e7a" gracePeriod=30 Dec 04 12:40:40 crc kubenswrapper[4760]: I1204 12:40:40.807376 4760 generic.go:334] "Generic (PLEG): container finished" podID="d389a85e-f3e6-4ad5-b11d-9555d6aee3a8" containerID="61f3caafe7376c0e45c33ece0a9e92fd8c81d842dd36a3ecc67f258cd02a7e7a" exitCode=2 Dec 04 12:40:40 crc kubenswrapper[4760]: I1204 12:40:40.807486 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d389a85e-f3e6-4ad5-b11d-9555d6aee3a8","Type":"ContainerDied","Data":"61f3caafe7376c0e45c33ece0a9e92fd8c81d842dd36a3ecc67f258cd02a7e7a"} Dec 04 12:40:40 crc kubenswrapper[4760]: I1204 12:40:40.817426 4760 generic.go:334] "Generic (PLEG): container finished" podID="2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d" containerID="50eeee381aaa92364b3d0d09c55e70284abc89a1b67d305bc306a63f2b1ac3d8" exitCode=137 Dec 04 12:40:40 crc kubenswrapper[4760]: I1204 12:40:40.817509 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d","Type":"ContainerDied","Data":"50eeee381aaa92364b3d0d09c55e70284abc89a1b67d305bc306a63f2b1ac3d8"} Dec 04 12:40:40 crc kubenswrapper[4760]: I1204 12:40:40.946889 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 12:40:40 crc kubenswrapper[4760]: I1204 12:40:40.965059 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 12:40:40 crc kubenswrapper[4760]: I1204 12:40:40.985740 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.221492 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.232081 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.379967 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-combined-ca-bundle\") pod \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.380149 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-config-data\") pod \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.380223 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh66t\" (UniqueName: \"kubernetes.io/projected/d389a85e-f3e6-4ad5-b11d-9555d6aee3a8-kube-api-access-rh66t\") pod \"d389a85e-f3e6-4ad5-b11d-9555d6aee3a8\" (UID: \"d389a85e-f3e6-4ad5-b11d-9555d6aee3a8\") " Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.380398 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpnf4\" (UniqueName: \"kubernetes.io/projected/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-kube-api-access-qpnf4\") pod \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\" (UID: \"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d\") " Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.388577 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-kube-api-access-qpnf4" (OuterVolumeSpecName: "kube-api-access-qpnf4") pod 
"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d" (UID: "2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d"). InnerVolumeSpecName "kube-api-access-qpnf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.389913 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d389a85e-f3e6-4ad5-b11d-9555d6aee3a8-kube-api-access-rh66t" (OuterVolumeSpecName: "kube-api-access-rh66t") pod "d389a85e-f3e6-4ad5-b11d-9555d6aee3a8" (UID: "d389a85e-f3e6-4ad5-b11d-9555d6aee3a8"). InnerVolumeSpecName "kube-api-access-rh66t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.421857 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d" (UID: "2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.436872 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-config-data" (OuterVolumeSpecName: "config-data") pod "2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d" (UID: "2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.482451 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.482511 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh66t\" (UniqueName: \"kubernetes.io/projected/d389a85e-f3e6-4ad5-b11d-9555d6aee3a8-kube-api-access-rh66t\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.482535 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpnf4\" (UniqueName: \"kubernetes.io/projected/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-kube-api-access-qpnf4\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.482550 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.834730 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d389a85e-f3e6-4ad5-b11d-9555d6aee3a8","Type":"ContainerDied","Data":"9b3d45ba2a0d36b16cc87922423097a3e0ee16471538f49a7d0f63218e5d43d3"} Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.834809 4760 scope.go:117] "RemoveContainer" containerID="61f3caafe7376c0e45c33ece0a9e92fd8c81d842dd36a3ecc67f258cd02a7e7a" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.834912 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.840439 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.840435 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d","Type":"ContainerDied","Data":"d5873d6f2587e1e684bfec5a462a71a6551cd8b48cbf6a9073521a1238d98ca6"} Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.929937 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.939339 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.971323 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 12:40:41 crc kubenswrapper[4760]: I1204 12:40:41.989406 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.019644 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.036619 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 12:40:42 crc kubenswrapper[4760]: E1204 12:40:42.037513 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.037538 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 12:40:42 crc kubenswrapper[4760]: E1204 12:40:42.037615 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d389a85e-f3e6-4ad5-b11d-9555d6aee3a8" containerName="kube-state-metrics" Dec 04 12:40:42 crc 
kubenswrapper[4760]: I1204 12:40:42.037629 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d389a85e-f3e6-4ad5-b11d-9555d6aee3a8" containerName="kube-state-metrics" Dec 04 12:40:42 crc kubenswrapper[4760]: E1204 12:40:42.037640 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59247d0a-4277-4c1f-b351-057d2b553de6" containerName="extract-content" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.037649 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="59247d0a-4277-4c1f-b351-057d2b553de6" containerName="extract-content" Dec 04 12:40:42 crc kubenswrapper[4760]: E1204 12:40:42.037667 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59247d0a-4277-4c1f-b351-057d2b553de6" containerName="registry-server" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.037674 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="59247d0a-4277-4c1f-b351-057d2b553de6" containerName="registry-server" Dec 04 12:40:42 crc kubenswrapper[4760]: E1204 12:40:42.037693 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59247d0a-4277-4c1f-b351-057d2b553de6" containerName="extract-utilities" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.037701 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="59247d0a-4277-4c1f-b351-057d2b553de6" containerName="extract-utilities" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.038005 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="59247d0a-4277-4c1f-b351-057d2b553de6" containerName="registry-server" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.038033 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.038052 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d389a85e-f3e6-4ad5-b11d-9555d6aee3a8" containerName="kube-state-metrics" 
Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.039281 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.059736 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.060805 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.068412 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.096157 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.100934 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.150367 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.153015 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.153497 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.163325 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.202926 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvqvp\" (UniqueName: 
\"kubernetes.io/projected/2cded8b4-450e-4646-b9c4-9df7334f5532-kube-api-access-hvqvp\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.203017 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.203079 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqrl5\" (UniqueName: \"kubernetes.io/projected/0074865f-e381-4090-a72e-54eec164814e-kube-api-access-xqrl5\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.203137 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.203240 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.203313 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.203358 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2cded8b4-450e-4646-b9c4-9df7334f5532-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.203386 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cded8b4-450e-4646-b9c4-9df7334f5532-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.203409 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cded8b4-450e-4646-b9c4-9df7334f5532-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.306245 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvqvp\" (UniqueName: \"kubernetes.io/projected/2cded8b4-450e-4646-b9c4-9df7334f5532-kube-api-access-hvqvp\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.306450 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.306554 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqrl5\" (UniqueName: \"kubernetes.io/projected/0074865f-e381-4090-a72e-54eec164814e-kube-api-access-xqrl5\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.306599 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.306732 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.306859 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.306939 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/2cded8b4-450e-4646-b9c4-9df7334f5532-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.306990 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cded8b4-450e-4646-b9c4-9df7334f5532-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.307028 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cded8b4-450e-4646-b9c4-9df7334f5532-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.333644 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2cded8b4-450e-4646-b9c4-9df7334f5532-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.336435 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cded8b4-450e-4646-b9c4-9df7334f5532-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.350899 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2cded8b4-450e-4646-b9c4-9df7334f5532-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.353997 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.355563 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvqvp\" (UniqueName: \"kubernetes.io/projected/2cded8b4-450e-4646-b9c4-9df7334f5532-kube-api-access-hvqvp\") pod \"kube-state-metrics-0\" (UID: \"2cded8b4-450e-4646-b9c4-9df7334f5532\") " pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.356165 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.365141 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.378092 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.379770 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqrl5\" (UniqueName: \"kubernetes.io/projected/0074865f-e381-4090-a72e-54eec164814e-kube-api-access-xqrl5\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.381736 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0074865f-e381-4090-a72e-54eec164814e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0074865f-e381-4090-a72e-54eec164814e\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.384671 4760 scope.go:117] "RemoveContainer" containerID="50eeee381aaa92364b3d0d09c55e70284abc89a1b67d305bc306a63f2b1ac3d8" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.464562 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:42 crc kubenswrapper[4760]: I1204 12:40:42.937505 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.217763 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 12:40:43 crc kubenswrapper[4760]: W1204 12:40:43.219851 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0074865f_e381_4090_a72e_54eec164814e.slice/crio-c1f6221913374f3d9f550c66f8e8c63c1da76a8a0cd985989146d9bd87de19a3 WatchSource:0}: Error finding container c1f6221913374f3d9f550c66f8e8c63c1da76a8a0cd985989146d9bd87de19a3: Status 404 returned error can't find the container with id c1f6221913374f3d9f550c66f8e8c63c1da76a8a0cd985989146d9bd87de19a3 Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.877656 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d" path="/var/lib/kubelet/pods/2ccfe1b0-d50f-4db1-8df1-3401f30ccb7d/volumes" Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.879027 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d389a85e-f3e6-4ad5-b11d-9555d6aee3a8" path="/var/lib/kubelet/pods/d389a85e-f3e6-4ad5-b11d-9555d6aee3a8/volumes" Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.884796 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2cded8b4-450e-4646-b9c4-9df7334f5532","Type":"ContainerStarted","Data":"c434bf02f4aae56469c35fcdf107a1d9c6c3a87ea565257849657b066d1ab358"} Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.884866 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"2cded8b4-450e-4646-b9c4-9df7334f5532","Type":"ContainerStarted","Data":"c5320eb8cbc10151e4c553a76fa06551efd38933e4da5b2f5b85bbb6604a4d86"} Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.885019 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.887616 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0074865f-e381-4090-a72e-54eec164814e","Type":"ContainerStarted","Data":"cc4a4b463e748554a06e8598e51a735a7be6e8f494e438cd3ef792db14b2392f"} Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.887667 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0074865f-e381-4090-a72e-54eec164814e","Type":"ContainerStarted","Data":"c1f6221913374f3d9f550c66f8e8c63c1da76a8a0cd985989146d9bd87de19a3"} Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.910308 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.910732 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="ceilometer-central-agent" containerID="cri-o://76bb5374363a7257d8ee2ee7910b6702f58eeb681602aaed8d8238f81623389c" gracePeriod=30 Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.910924 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="proxy-httpd" containerID="cri-o://d054fcb57225874f17f8657980e47b6d9fbfe5c045990f24232ac1a2c61e1272" gracePeriod=30 Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.910967 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="sg-core" containerID="cri-o://7aa7fc61ae5d0d8ac02910863a31254467a19710734535921c179e015468a44f" gracePeriod=30 Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.911005 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="ceilometer-notification-agent" containerID="cri-o://792d70a6efb21a6ffb1b1f449222869028cde729d7e520d51b6d72bc5c62717c" gracePeriod=30 Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.919978 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.532385925 podStartE2EDuration="2.919956357s" podCreationTimestamp="2025-12-04 12:40:41 +0000 UTC" firstStartedPulling="2025-12-04 12:40:42.962384361 +0000 UTC m=+1646.003830928" lastFinishedPulling="2025-12-04 12:40:43.349954793 +0000 UTC m=+1646.391401360" observedRunningTime="2025-12-04 12:40:43.918006955 +0000 UTC m=+1646.959453522" watchObservedRunningTime="2025-12-04 12:40:43.919956357 +0000 UTC m=+1646.961402924" Dec 04 12:40:43 crc kubenswrapper[4760]: I1204 12:40:43.952740 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.952704336 podStartE2EDuration="2.952704336s" podCreationTimestamp="2025-12-04 12:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:43.942522803 +0000 UTC m=+1646.983969370" watchObservedRunningTime="2025-12-04 12:40:43.952704336 +0000 UTC m=+1646.994150913" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.071680 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.071773 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-api-0" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.072409 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.072478 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.076094 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.078798 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.340237 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-ttkjm"] Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.343019 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.411987 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-ttkjm"] Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.529114 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.529193 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztqbg\" (UniqueName: \"kubernetes.io/projected/33a2d963-2a05-4e38-b7ee-fd3114f137c2-kube-api-access-ztqbg\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " 
pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.529350 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-config\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.529377 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-svc\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.529473 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-swift-storage-0\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.529499 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.632289 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-swift-storage-0\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: 
\"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.632777 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.632961 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.633108 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztqbg\" (UniqueName: \"kubernetes.io/projected/33a2d963-2a05-4e38-b7ee-fd3114f137c2-kube-api-access-ztqbg\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.637641 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.637667 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " 
pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.633366 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-config\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.642828 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-svc\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.652537 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-swift-storage-0\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.654392 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-svc\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.656738 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-config\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.667547 4760 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztqbg\" (UniqueName: \"kubernetes.io/projected/33a2d963-2a05-4e38-b7ee-fd3114f137c2-kube-api-access-ztqbg\") pod \"dnsmasq-dns-5b4c997d87-ttkjm\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.720605 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.908449 4760 generic.go:334] "Generic (PLEG): container finished" podID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerID="d054fcb57225874f17f8657980e47b6d9fbfe5c045990f24232ac1a2c61e1272" exitCode=0 Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.908768 4760 generic.go:334] "Generic (PLEG): container finished" podID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerID="7aa7fc61ae5d0d8ac02910863a31254467a19710734535921c179e015468a44f" exitCode=2 Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.908779 4760 generic.go:334] "Generic (PLEG): container finished" podID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerID="76bb5374363a7257d8ee2ee7910b6702f58eeb681602aaed8d8238f81623389c" exitCode=0 Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.910056 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99d6f493-b6f0-4340-9549-3c2c63e3c823","Type":"ContainerDied","Data":"d054fcb57225874f17f8657980e47b6d9fbfe5c045990f24232ac1a2c61e1272"} Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.910093 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99d6f493-b6f0-4340-9549-3c2c63e3c823","Type":"ContainerDied","Data":"7aa7fc61ae5d0d8ac02910863a31254467a19710734535921c179e015468a44f"} Dec 04 12:40:44 crc kubenswrapper[4760]: I1204 12:40:44.910107 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"99d6f493-b6f0-4340-9549-3c2c63e3c823","Type":"ContainerDied","Data":"76bb5374363a7257d8ee2ee7910b6702f58eeb681602aaed8d8238f81623389c"} Dec 04 12:40:45 crc kubenswrapper[4760]: I1204 12:40:45.434784 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-ttkjm"] Dec 04 12:40:45 crc kubenswrapper[4760]: I1204 12:40:45.923363 4760 generic.go:334] "Generic (PLEG): container finished" podID="33a2d963-2a05-4e38-b7ee-fd3114f137c2" containerID="5dcf7eeec99535ce79de885e26d28e82676115f5094fa8e367a51e5c8aef363b" exitCode=0 Dec 04 12:40:45 crc kubenswrapper[4760]: I1204 12:40:45.923427 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" event={"ID":"33a2d963-2a05-4e38-b7ee-fd3114f137c2","Type":"ContainerDied","Data":"5dcf7eeec99535ce79de885e26d28e82676115f5094fa8e367a51e5c8aef363b"} Dec 04 12:40:45 crc kubenswrapper[4760]: I1204 12:40:45.923776 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" event={"ID":"33a2d963-2a05-4e38-b7ee-fd3114f137c2","Type":"ContainerStarted","Data":"535dfb3d262d5672862849da243230c9bc29b85367ecaa55f44a90ce65b20328"} Dec 04 12:40:46 crc kubenswrapper[4760]: I1204 12:40:46.937509 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" event={"ID":"33a2d963-2a05-4e38-b7ee-fd3114f137c2","Type":"ContainerStarted","Data":"3b534b3870c0455f46706b086c68798186dd666caa94582c32cfb3cba0feb8c0"} Dec 04 12:40:46 crc kubenswrapper[4760]: I1204 12:40:46.938954 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:46 crc kubenswrapper[4760]: I1204 12:40:46.974470 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" podStartSLOduration=2.974441545 podStartE2EDuration="2.974441545s" podCreationTimestamp="2025-12-04 
12:40:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:46.96389513 +0000 UTC m=+1650.005341697" watchObservedRunningTime="2025-12-04 12:40:46.974441545 +0000 UTC m=+1650.015888112" Dec 04 12:40:47 crc kubenswrapper[4760]: I1204 12:40:47.465957 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:48 crc kubenswrapper[4760]: I1204 12:40:48.191728 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:48 crc kubenswrapper[4760]: I1204 12:40:48.192445 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="85936742-df3d-4545-a9c1-9b948f71f849" containerName="nova-api-log" containerID="cri-o://54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199" gracePeriod=30 Dec 04 12:40:48 crc kubenswrapper[4760]: I1204 12:40:48.193269 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="85936742-df3d-4545-a9c1-9b948f71f849" containerName="nova-api-api" containerID="cri-o://676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17" gracePeriod=30 Dec 04 12:40:48 crc kubenswrapper[4760]: I1204 12:40:48.970908 4760 generic.go:334] "Generic (PLEG): container finished" podID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerID="792d70a6efb21a6ffb1b1f449222869028cde729d7e520d51b6d72bc5c62717c" exitCode=0 Dec 04 12:40:48 crc kubenswrapper[4760]: I1204 12:40:48.971342 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99d6f493-b6f0-4340-9549-3c2c63e3c823","Type":"ContainerDied","Data":"792d70a6efb21a6ffb1b1f449222869028cde729d7e520d51b6d72bc5c62717c"} Dec 04 12:40:48 crc kubenswrapper[4760]: I1204 12:40:48.975816 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="85936742-df3d-4545-a9c1-9b948f71f849" containerID="54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199" exitCode=143 Dec 04 12:40:48 crc kubenswrapper[4760]: I1204 12:40:48.975907 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85936742-df3d-4545-a9c1-9b948f71f849","Type":"ContainerDied","Data":"54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199"} Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.586767 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.790953 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-run-httpd\") pod \"99d6f493-b6f0-4340-9549-3c2c63e3c823\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.791083 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-sg-core-conf-yaml\") pod \"99d6f493-b6f0-4340-9549-3c2c63e3c823\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.791136 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-config-data\") pod \"99d6f493-b6f0-4340-9549-3c2c63e3c823\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.791181 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-combined-ca-bundle\") pod \"99d6f493-b6f0-4340-9549-3c2c63e3c823\" (UID: 
\"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.791277 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khwm8\" (UniqueName: \"kubernetes.io/projected/99d6f493-b6f0-4340-9549-3c2c63e3c823-kube-api-access-khwm8\") pod \"99d6f493-b6f0-4340-9549-3c2c63e3c823\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.791333 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-log-httpd\") pod \"99d6f493-b6f0-4340-9549-3c2c63e3c823\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.791434 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-scripts\") pod \"99d6f493-b6f0-4340-9549-3c2c63e3c823\" (UID: \"99d6f493-b6f0-4340-9549-3c2c63e3c823\") " Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.791556 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "99d6f493-b6f0-4340-9549-3c2c63e3c823" (UID: "99d6f493-b6f0-4340-9549-3c2c63e3c823"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.791855 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "99d6f493-b6f0-4340-9549-3c2c63e3c823" (UID: "99d6f493-b6f0-4340-9549-3c2c63e3c823"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.792642 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.792676 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99d6f493-b6f0-4340-9549-3c2c63e3c823-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.806532 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-scripts" (OuterVolumeSpecName: "scripts") pod "99d6f493-b6f0-4340-9549-3c2c63e3c823" (UID: "99d6f493-b6f0-4340-9549-3c2c63e3c823"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.806780 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d6f493-b6f0-4340-9549-3c2c63e3c823-kube-api-access-khwm8" (OuterVolumeSpecName: "kube-api-access-khwm8") pod "99d6f493-b6f0-4340-9549-3c2c63e3c823" (UID: "99d6f493-b6f0-4340-9549-3c2c63e3c823"). InnerVolumeSpecName "kube-api-access-khwm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.830879 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "99d6f493-b6f0-4340-9549-3c2c63e3c823" (UID: "99d6f493-b6f0-4340-9549-3c2c63e3c823"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.874178 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:40:49 crc kubenswrapper[4760]: E1204 12:40:49.874657 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.894431 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.894470 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khwm8\" (UniqueName: \"kubernetes.io/projected/99d6f493-b6f0-4340-9549-3c2c63e3c823-kube-api-access-khwm8\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.894504 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.945644 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99d6f493-b6f0-4340-9549-3c2c63e3c823" (UID: "99d6f493-b6f0-4340-9549-3c2c63e3c823"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.946601 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-config-data" (OuterVolumeSpecName: "config-data") pod "99d6f493-b6f0-4340-9549-3c2c63e3c823" (UID: "99d6f493-b6f0-4340-9549-3c2c63e3c823"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.996598 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99d6f493-b6f0-4340-9549-3c2c63e3c823","Type":"ContainerDied","Data":"407461c8829dcb5f0bb81066ae0fb908cc21ecf603b1c55f107708e6446372a2"} Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.996667 4760 scope.go:117] "RemoveContainer" containerID="d054fcb57225874f17f8657980e47b6d9fbfe5c045990f24232ac1a2c61e1272" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.996816 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.996855 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:40:49 crc kubenswrapper[4760]: I1204 12:40:49.996863 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d6f493-b6f0-4340-9549-3c2c63e3c823-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.031404 4760 scope.go:117] "RemoveContainer" containerID="7aa7fc61ae5d0d8ac02910863a31254467a19710734535921c179e015468a44f" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.044155 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.056075 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.059001 4760 scope.go:117] "RemoveContainer" containerID="792d70a6efb21a6ffb1b1f449222869028cde729d7e520d51b6d72bc5c62717c" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.068528 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:40:50 crc kubenswrapper[4760]: E1204 12:40:50.069039 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="ceilometer-notification-agent" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.069064 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="ceilometer-notification-agent" Dec 04 12:40:50 crc kubenswrapper[4760]: E1204 12:40:50.069096 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="sg-core" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.069104 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="sg-core" Dec 04 12:40:50 crc kubenswrapper[4760]: E1204 12:40:50.069114 4760 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="proxy-httpd" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.069120 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="proxy-httpd" Dec 04 12:40:50 crc kubenswrapper[4760]: E1204 12:40:50.069133 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="ceilometer-central-agent" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.069139 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="ceilometer-central-agent" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.069353 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="sg-core" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.069369 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="ceilometer-notification-agent" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.069379 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="ceilometer-central-agent" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.069392 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" containerName="proxy-httpd" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.073646 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.077772 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.077940 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.078348 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.093856 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.099919 4760 scope.go:117] "RemoveContainer" containerID="76bb5374363a7257d8ee2ee7910b6702f58eeb681602aaed8d8238f81623389c" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.201127 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.201249 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.201380 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-config-data\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " 
pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.201468 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-log-httpd\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.201497 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-scripts\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.201514 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.201572 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-run-httpd\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.201634 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84s42\" (UniqueName: \"kubernetes.io/projected/3f8eee46-6a6d-4385-abac-8230ecef5165-kube-api-access-84s42\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.304419 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.304528 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.304596 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-config-data\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.304643 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-log-httpd\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.304669 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-scripts\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.304690 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " 
pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.304726 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-run-httpd\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.304771 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84s42\" (UniqueName: \"kubernetes.io/projected/3f8eee46-6a6d-4385-abac-8230ecef5165-kube-api-access-84s42\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.305656 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-run-httpd\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.305872 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-log-httpd\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.309383 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.309409 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.310429 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-config-data\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.310882 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.311476 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-scripts\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.324007 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84s42\" (UniqueName: \"kubernetes.io/projected/3f8eee46-6a6d-4385-abac-8230ecef5165-kube-api-access-84s42\") pod \"ceilometer-0\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.395131 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:40:50 crc kubenswrapper[4760]: I1204 12:40:50.997054 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:40:51 crc kubenswrapper[4760]: I1204 12:40:51.877643 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:40:51 crc kubenswrapper[4760]: I1204 12:40:51.878113 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d6f493-b6f0-4340-9549-3c2c63e3c823" path="/var/lib/kubelet/pods/99d6f493-b6f0-4340-9549-3c2c63e3c823/volumes" Dec 04 12:40:51 crc kubenswrapper[4760]: I1204 12:40:51.900527 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:40:51 crc kubenswrapper[4760]: I1204 12:40:51.972943 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-config-data\") pod \"85936742-df3d-4545-a9c1-9b948f71f849\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " Dec 04 12:40:51 crc kubenswrapper[4760]: I1204 12:40:51.973098 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgj5p\" (UniqueName: \"kubernetes.io/projected/85936742-df3d-4545-a9c1-9b948f71f849-kube-api-access-sgj5p\") pod \"85936742-df3d-4545-a9c1-9b948f71f849\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " Dec 04 12:40:51 crc kubenswrapper[4760]: I1204 12:40:51.973151 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85936742-df3d-4545-a9c1-9b948f71f849-logs\") pod \"85936742-df3d-4545-a9c1-9b948f71f849\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " Dec 04 12:40:51 crc kubenswrapper[4760]: I1204 12:40:51.973262 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-combined-ca-bundle\") pod \"85936742-df3d-4545-a9c1-9b948f71f849\" (UID: \"85936742-df3d-4545-a9c1-9b948f71f849\") " Dec 04 12:40:51 crc kubenswrapper[4760]: I1204 12:40:51.975327 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85936742-df3d-4545-a9c1-9b948f71f849-logs" (OuterVolumeSpecName: "logs") pod "85936742-df3d-4545-a9c1-9b948f71f849" (UID: "85936742-df3d-4545-a9c1-9b948f71f849"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:40:51 crc kubenswrapper[4760]: I1204 12:40:51.995618 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85936742-df3d-4545-a9c1-9b948f71f849-kube-api-access-sgj5p" (OuterVolumeSpecName: "kube-api-access-sgj5p") pod "85936742-df3d-4545-a9c1-9b948f71f849" (UID: "85936742-df3d-4545-a9c1-9b948f71f849"). InnerVolumeSpecName "kube-api-access-sgj5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.040422 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-config-data" (OuterVolumeSpecName: "config-data") pod "85936742-df3d-4545-a9c1-9b948f71f849" (UID: "85936742-df3d-4545-a9c1-9b948f71f849"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.049154 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85936742-df3d-4545-a9c1-9b948f71f849" (UID: "85936742-df3d-4545-a9c1-9b948f71f849"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.085085 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.085134 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85936742-df3d-4545-a9c1-9b948f71f849-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.085150 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgj5p\" (UniqueName: \"kubernetes.io/projected/85936742-df3d-4545-a9c1-9b948f71f849-kube-api-access-sgj5p\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.085163 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85936742-df3d-4545-a9c1-9b948f71f849-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.111030 4760 generic.go:334] "Generic (PLEG): container finished" podID="85936742-df3d-4545-a9c1-9b948f71f849" containerID="676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17" exitCode=0 Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.114630 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.116013 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85936742-df3d-4545-a9c1-9b948f71f849","Type":"ContainerDied","Data":"676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17"} Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.133116 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"85936742-df3d-4545-a9c1-9b948f71f849","Type":"ContainerDied","Data":"8b70346da0f30308d0bfc36fce8c285077fee674fbe52d4f3c0f67de66952868"} Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.133162 4760 scope.go:117] "RemoveContainer" containerID="676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.157092 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f8eee46-6a6d-4385-abac-8230ecef5165","Type":"ContainerStarted","Data":"cc9c955e02b743b1a7ce3fa1b212f049c713034a2965c5504fc8916542bf0565"} Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.333670 4760 scope.go:117] "RemoveContainer" containerID="54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.341290 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.347736 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.403322 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:52 crc kubenswrapper[4760]: E1204 12:40:52.404165 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85936742-df3d-4545-a9c1-9b948f71f849" containerName="nova-api-api" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.404194 
4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="85936742-df3d-4545-a9c1-9b948f71f849" containerName="nova-api-api" Dec 04 12:40:52 crc kubenswrapper[4760]: E1204 12:40:52.404234 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85936742-df3d-4545-a9c1-9b948f71f849" containerName="nova-api-log" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.404244 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="85936742-df3d-4545-a9c1-9b948f71f849" containerName="nova-api-log" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.404587 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="85936742-df3d-4545-a9c1-9b948f71f849" containerName="nova-api-log" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.404605 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="85936742-df3d-4545-a9c1-9b948f71f849" containerName="nova-api-api" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.406368 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.407829 4760 scope.go:117] "RemoveContainer" containerID="676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.409535 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.409758 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 12:40:52 crc kubenswrapper[4760]: E1204 12:40:52.409940 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17\": container with ID starting with 676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17 not found: ID does not exist" containerID="676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.409979 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17"} err="failed to get container status \"676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17\": rpc error: code = NotFound desc = could not find container \"676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17\": container with ID starting with 676c0399d66ff6f4c4ad04f123d4ab5c83141e5a5746538c5710c679386bac17 not found: ID does not exist" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.410019 4760 scope.go:117] "RemoveContainer" containerID="54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199" Dec 04 12:40:52 crc kubenswrapper[4760]: E1204 12:40:52.411063 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199\": container with ID starting with 54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199 not found: ID does not exist" containerID="54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.411094 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199"} err="failed to get container status \"54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199\": rpc error: code = NotFound desc = could not find container \"54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199\": container with ID starting with 54654c80e84119f6514054de3941ff372724e417bd8163d4f880b136fb451199 not found: ID does not exist" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.417602 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.423281 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.449427 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.467969 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.510802 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c227\" (UniqueName: \"kubernetes.io/projected/3d4f058e-0086-4779-8434-9de8045621fe-kube-api-access-4c227\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.511255 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-config-data\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.511285 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.511332 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.511361 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:52 crc kubenswrapper[4760]: I1204 12:40:52.511385 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d4f058e-0086-4779-8434-9de8045621fe-logs\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.546242 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 
12:40:52.614463 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-config-data\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.614536 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.614608 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.614650 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.614672 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d4f058e-0086-4779-8434-9de8045621fe-logs\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.614879 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c227\" (UniqueName: \"kubernetes.io/projected/3d4f058e-0086-4779-8434-9de8045621fe-kube-api-access-4c227\") pod \"nova-api-0\" (UID: 
\"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.617501 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d4f058e-0086-4779-8434-9de8045621fe-logs\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.622758 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.623452 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.625017 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.639722 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c227\" (UniqueName: \"kubernetes.io/projected/3d4f058e-0086-4779-8434-9de8045621fe-kube-api-access-4c227\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.644434 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-config-data\") pod \"nova-api-0\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:52.753764 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.196322 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f8eee46-6a6d-4385-abac-8230ecef5165","Type":"ContainerStarted","Data":"e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7"} Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.222644 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.417348 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-2mn2p"] Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.442475 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.472121 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.472366 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.520298 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2mn2p"] Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.553463 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-config-data\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.553576 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn6dw\" (UniqueName: \"kubernetes.io/projected/8b9dacad-1c52-4236-9b38-e4d91fea47c8-kube-api-access-qn6dw\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.553625 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.553710 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-scripts\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.576794 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6cgxg"] Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.579799 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.636576 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6cgxg"] Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.656156 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-scripts\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.658634 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-config-data\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.658859 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn6dw\" (UniqueName: \"kubernetes.io/projected/8b9dacad-1c52-4236-9b38-e4d91fea47c8-kube-api-access-qn6dw\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.658985 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.666510 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.668974 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-config-data\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.678924 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-scripts\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.689964 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn6dw\" (UniqueName: \"kubernetes.io/projected/8b9dacad-1c52-4236-9b38-e4d91fea47c8-kube-api-access-qn6dw\") pod \"nova-cell1-cell-mapping-2mn2p\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.770338 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hgqt6\" (UniqueName: \"kubernetes.io/projected/5108bcea-b5b8-4a60-9a40-929ab1dfa845-kube-api-access-hgqt6\") pod \"community-operators-6cgxg\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.770657 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-utilities\") pod \"community-operators-6cgxg\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.770737 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-catalog-content\") pod \"community-operators-6cgxg\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.819020 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.837462 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.875195 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-catalog-content\") pod \"community-operators-6cgxg\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.875571 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgqt6\" (UniqueName: \"kubernetes.io/projected/5108bcea-b5b8-4a60-9a40-929ab1dfa845-kube-api-access-hgqt6\") pod \"community-operators-6cgxg\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.875601 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-utilities\") pod \"community-operators-6cgxg\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.876500 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-utilities\") pod \"community-operators-6cgxg\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.876571 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-catalog-content\") pod \"community-operators-6cgxg\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " 
pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.886634 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85936742-df3d-4545-a9c1-9b948f71f849" path="/var/lib/kubelet/pods/85936742-df3d-4545-a9c1-9b948f71f849/volumes" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.902572 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgqt6\" (UniqueName: \"kubernetes.io/projected/5108bcea-b5b8-4a60-9a40-929ab1dfa845-kube-api-access-hgqt6\") pod \"community-operators-6cgxg\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:53 crc kubenswrapper[4760]: I1204 12:40:53.937386 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:40:54 crc kubenswrapper[4760]: I1204 12:40:54.235438 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d4f058e-0086-4779-8434-9de8045621fe","Type":"ContainerStarted","Data":"ac0d42571b42f5c66c20510c8e1d6c223d7760d1ed753cc0294d886e5065f9ba"} Dec 04 12:40:54 crc kubenswrapper[4760]: I1204 12:40:54.247608 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f8eee46-6a6d-4385-abac-8230ecef5165","Type":"ContainerStarted","Data":"3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b"} Dec 04 12:40:54 crc kubenswrapper[4760]: I1204 12:40:54.247666 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f8eee46-6a6d-4385-abac-8230ecef5165","Type":"ContainerStarted","Data":"9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9"} Dec 04 12:40:54 crc kubenswrapper[4760]: I1204 12:40:54.478754 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2mn2p"] Dec 04 12:40:54 crc kubenswrapper[4760]: 
W1204 12:40:54.493309 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b9dacad_1c52_4236_9b38_e4d91fea47c8.slice/crio-e14923e3b78f4a3cfc9a61d01e689aadd150b721df7bde8303b7f83c6c6b1425 WatchSource:0}: Error finding container e14923e3b78f4a3cfc9a61d01e689aadd150b721df7bde8303b7f83c6c6b1425: Status 404 returned error can't find the container with id e14923e3b78f4a3cfc9a61d01e689aadd150b721df7bde8303b7f83c6c6b1425 Dec 04 12:40:54 crc kubenswrapper[4760]: I1204 12:40:54.724442 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:40:54 crc kubenswrapper[4760]: I1204 12:40:54.764284 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6cgxg"] Dec 04 12:40:54 crc kubenswrapper[4760]: I1204 12:40:54.836634 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-knjbd"] Dec 04 12:40:54 crc kubenswrapper[4760]: I1204 12:40:54.837663 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" podUID="6d016d51-bdd1-490e-8751-a0c0bcf70f92" containerName="dnsmasq-dns" containerID="cri-o://13775dd6e0320c6390ee1456def0a1bb1735174b5f463576bb1828d94176943f" gracePeriod=10 Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.271596 4760 generic.go:334] "Generic (PLEG): container finished" podID="6d016d51-bdd1-490e-8751-a0c0bcf70f92" containerID="13775dd6e0320c6390ee1456def0a1bb1735174b5f463576bb1828d94176943f" exitCode=0 Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.271710 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" event={"ID":"6d016d51-bdd1-490e-8751-a0c0bcf70f92","Type":"ContainerDied","Data":"13775dd6e0320c6390ee1456def0a1bb1735174b5f463576bb1828d94176943f"} Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 
12:40:55.276950 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgxg" event={"ID":"5108bcea-b5b8-4a60-9a40-929ab1dfa845","Type":"ContainerStarted","Data":"0e1a9a88bbe82fd0670a4f427248688b4e2561809cfefc2ff63c45a6070273d8"} Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.284996 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2mn2p" event={"ID":"8b9dacad-1c52-4236-9b38-e4d91fea47c8","Type":"ContainerStarted","Data":"d2bc4a8ede2efb016db7afb420f19b4d99345b2acf77fc91ec1cbd68019c848d"} Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.285068 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2mn2p" event={"ID":"8b9dacad-1c52-4236-9b38-e4d91fea47c8","Type":"ContainerStarted","Data":"e14923e3b78f4a3cfc9a61d01e689aadd150b721df7bde8303b7f83c6c6b1425"} Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.296492 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d4f058e-0086-4779-8434-9de8045621fe","Type":"ContainerStarted","Data":"f90aad0c81b36f22c7a7603d165ff3c87a92dc9474cf39d0644aa554e5b3c210"} Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.296570 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d4f058e-0086-4779-8434-9de8045621fe","Type":"ContainerStarted","Data":"905777a4cc48ee01579b52b1194c91aff50c3a805effcfe68eab2f77c64df90c"} Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.333015 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.332979281 podStartE2EDuration="3.332979281s" podCreationTimestamp="2025-12-04 12:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:55.324676648 +0000 UTC m=+1658.366123225" 
watchObservedRunningTime="2025-12-04 12:40:55.332979281 +0000 UTC m=+1658.374425848" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.542631 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.643479 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-svc\") pod \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.643788 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-config\") pod \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.643819 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-swift-storage-0\") pod \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.643885 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-nb\") pod \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.643957 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v622n\" (UniqueName: \"kubernetes.io/projected/6d016d51-bdd1-490e-8751-a0c0bcf70f92-kube-api-access-v622n\") pod \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\" (UID: 
\"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.644169 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-sb\") pod \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\" (UID: \"6d016d51-bdd1-490e-8751-a0c0bcf70f92\") " Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.668528 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d016d51-bdd1-490e-8751-a0c0bcf70f92-kube-api-access-v622n" (OuterVolumeSpecName: "kube-api-access-v622n") pod "6d016d51-bdd1-490e-8751-a0c0bcf70f92" (UID: "6d016d51-bdd1-490e-8751-a0c0bcf70f92"). InnerVolumeSpecName "kube-api-access-v622n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.733393 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d016d51-bdd1-490e-8751-a0c0bcf70f92" (UID: "6d016d51-bdd1-490e-8751-a0c0bcf70f92"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.747473 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.747519 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v622n\" (UniqueName: \"kubernetes.io/projected/6d016d51-bdd1-490e-8751-a0c0bcf70f92-kube-api-access-v622n\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.762535 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-config" (OuterVolumeSpecName: "config") pod "6d016d51-bdd1-490e-8751-a0c0bcf70f92" (UID: "6d016d51-bdd1-490e-8751-a0c0bcf70f92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.834010 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d016d51-bdd1-490e-8751-a0c0bcf70f92" (UID: "6d016d51-bdd1-490e-8751-a0c0bcf70f92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.850828 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d016d51-bdd1-490e-8751-a0c0bcf70f92" (UID: "6d016d51-bdd1-490e-8751-a0c0bcf70f92"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.852990 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.853023 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.853035 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.894978 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6d016d51-bdd1-490e-8751-a0c0bcf70f92" (UID: "6d016d51-bdd1-490e-8751-a0c0bcf70f92"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:40:55 crc kubenswrapper[4760]: I1204 12:40:55.955321 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d016d51-bdd1-490e-8751-a0c0bcf70f92-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.309589 4760 generic.go:334] "Generic (PLEG): container finished" podID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerID="e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e" exitCode=0 Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.309729 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgxg" event={"ID":"5108bcea-b5b8-4a60-9a40-929ab1dfa845","Type":"ContainerDied","Data":"e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e"} Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.315777 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f8eee46-6a6d-4385-abac-8230ecef5165","Type":"ContainerStarted","Data":"f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928"} Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.316519 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.316559 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="sg-core" containerID="cri-o://3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b" gracePeriod=30 Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.316597 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="proxy-httpd" 
containerID="cri-o://f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928" gracePeriod=30 Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.316459 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="ceilometer-central-agent" containerID="cri-o://e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7" gracePeriod=30 Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.316787 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="ceilometer-notification-agent" containerID="cri-o://9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9" gracePeriod=30 Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.323316 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.323345 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6c754dc9-knjbd" event={"ID":"6d016d51-bdd1-490e-8751-a0c0bcf70f92","Type":"ContainerDied","Data":"f6f70fbc25af9834aa36c4996a71929392fc662a56cbcea44894568e3f7d104e"} Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.323865 4760 scope.go:117] "RemoveContainer" containerID="13775dd6e0320c6390ee1456def0a1bb1735174b5f463576bb1828d94176943f" Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.358968 4760 scope.go:117] "RemoveContainer" containerID="2c63fa4a332f036e276c1c4d578d261f67a08e11df39ad3fe7d6b5202ddaffab" Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.415818 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.853796821 podStartE2EDuration="6.415782762s" podCreationTimestamp="2025-12-04 12:40:50 +0000 UTC" firstStartedPulling="2025-12-04 
12:40:51.022273406 +0000 UTC m=+1654.063719983" lastFinishedPulling="2025-12-04 12:40:55.584259347 +0000 UTC m=+1658.625705924" observedRunningTime="2025-12-04 12:40:56.373761688 +0000 UTC m=+1659.415208255" watchObservedRunningTime="2025-12-04 12:40:56.415782762 +0000 UTC m=+1659.457229329" Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.436593 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2mn2p" podStartSLOduration=3.436514071 podStartE2EDuration="3.436514071s" podCreationTimestamp="2025-12-04 12:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:40:56.42136518 +0000 UTC m=+1659.462811767" watchObservedRunningTime="2025-12-04 12:40:56.436514071 +0000 UTC m=+1659.477960658" Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.464444 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-knjbd"] Dec 04 12:40:56 crc kubenswrapper[4760]: I1204 12:40:56.476089 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b6c754dc9-knjbd"] Dec 04 12:40:57 crc kubenswrapper[4760]: I1204 12:40:57.355444 4760 generic.go:334] "Generic (PLEG): container finished" podID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerID="3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b" exitCode=2 Dec 04 12:40:57 crc kubenswrapper[4760]: I1204 12:40:57.355692 4760 generic.go:334] "Generic (PLEG): container finished" podID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerID="9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9" exitCode=0 Dec 04 12:40:57 crc kubenswrapper[4760]: I1204 12:40:57.355747 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f8eee46-6a6d-4385-abac-8230ecef5165","Type":"ContainerDied","Data":"3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b"} 
Dec 04 12:40:57 crc kubenswrapper[4760]: I1204 12:40:57.355782 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f8eee46-6a6d-4385-abac-8230ecef5165","Type":"ContainerDied","Data":"9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9"} Dec 04 12:40:57 crc kubenswrapper[4760]: I1204 12:40:57.882082 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d016d51-bdd1-490e-8751-a0c0bcf70f92" path="/var/lib/kubelet/pods/6d016d51-bdd1-490e-8751-a0c0bcf70f92/volumes" Dec 04 12:40:58 crc kubenswrapper[4760]: I1204 12:40:58.369047 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgxg" event={"ID":"5108bcea-b5b8-4a60-9a40-929ab1dfa845","Type":"ContainerStarted","Data":"10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450"} Dec 04 12:40:59 crc kubenswrapper[4760]: I1204 12:40:59.384941 4760 generic.go:334] "Generic (PLEG): container finished" podID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerID="10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450" exitCode=0 Dec 04 12:40:59 crc kubenswrapper[4760]: I1204 12:40:59.385020 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgxg" event={"ID":"5108bcea-b5b8-4a60-9a40-929ab1dfa845","Type":"ContainerDied","Data":"10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450"} Dec 04 12:41:00 crc kubenswrapper[4760]: I1204 12:41:00.402004 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgxg" event={"ID":"5108bcea-b5b8-4a60-9a40-929ab1dfa845","Type":"ContainerStarted","Data":"6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3"} Dec 04 12:41:00 crc kubenswrapper[4760]: I1204 12:41:00.449361 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6cgxg" podStartSLOduration=3.890343086 
podStartE2EDuration="7.449325389s" podCreationTimestamp="2025-12-04 12:40:53 +0000 UTC" firstStartedPulling="2025-12-04 12:40:56.311506212 +0000 UTC m=+1659.352952779" lastFinishedPulling="2025-12-04 12:40:59.870488515 +0000 UTC m=+1662.911935082" observedRunningTime="2025-12-04 12:41:00.442125461 +0000 UTC m=+1663.483572028" watchObservedRunningTime="2025-12-04 12:41:00.449325389 +0000 UTC m=+1663.490771956" Dec 04 12:41:00 crc kubenswrapper[4760]: I1204 12:41:00.865302 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:41:00 crc kubenswrapper[4760]: E1204 12:41:00.865864 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:41:02 crc kubenswrapper[4760]: I1204 12:41:02.754604 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 12:41:02 crc kubenswrapper[4760]: I1204 12:41:02.755152 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 12:41:03 crc kubenswrapper[4760]: I1204 12:41:03.436853 4760 generic.go:334] "Generic (PLEG): container finished" podID="8b9dacad-1c52-4236-9b38-e4d91fea47c8" containerID="d2bc4a8ede2efb016db7afb420f19b4d99345b2acf77fc91ec1cbd68019c848d" exitCode=0 Dec 04 12:41:03 crc kubenswrapper[4760]: I1204 12:41:03.436962 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2mn2p" event={"ID":"8b9dacad-1c52-4236-9b38-e4d91fea47c8","Type":"ContainerDied","Data":"d2bc4a8ede2efb016db7afb420f19b4d99345b2acf77fc91ec1cbd68019c848d"} Dec 04 
12:41:03 crc kubenswrapper[4760]: I1204 12:41:03.442996 4760 generic.go:334] "Generic (PLEG): container finished" podID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerID="e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7" exitCode=0 Dec 04 12:41:03 crc kubenswrapper[4760]: I1204 12:41:03.443061 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f8eee46-6a6d-4385-abac-8230ecef5165","Type":"ContainerDied","Data":"e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7"} Dec 04 12:41:03 crc kubenswrapper[4760]: I1204 12:41:03.772379 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3d4f058e-0086-4779-8434-9de8045621fe" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:41:03 crc kubenswrapper[4760]: I1204 12:41:03.772468 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3d4f058e-0086-4779-8434-9de8045621fe" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:41:03 crc kubenswrapper[4760]: I1204 12:41:03.938721 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:41:03 crc kubenswrapper[4760]: I1204 12:41:03.938836 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:41:03 crc kubenswrapper[4760]: I1204 12:41:03.996249 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:41:04 crc kubenswrapper[4760]: I1204 12:41:04.910533 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.008062 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn6dw\" (UniqueName: \"kubernetes.io/projected/8b9dacad-1c52-4236-9b38-e4d91fea47c8-kube-api-access-qn6dw\") pod \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.008182 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-scripts\") pod \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.008448 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-config-data\") pod \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.008471 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-combined-ca-bundle\") pod \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\" (UID: \"8b9dacad-1c52-4236-9b38-e4d91fea47c8\") " Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.026422 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-scripts" (OuterVolumeSpecName: "scripts") pod "8b9dacad-1c52-4236-9b38-e4d91fea47c8" (UID: "8b9dacad-1c52-4236-9b38-e4d91fea47c8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.045725 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9dacad-1c52-4236-9b38-e4d91fea47c8-kube-api-access-qn6dw" (OuterVolumeSpecName: "kube-api-access-qn6dw") pod "8b9dacad-1c52-4236-9b38-e4d91fea47c8" (UID: "8b9dacad-1c52-4236-9b38-e4d91fea47c8"). InnerVolumeSpecName "kube-api-access-qn6dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.046499 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-config-data" (OuterVolumeSpecName: "config-data") pod "8b9dacad-1c52-4236-9b38-e4d91fea47c8" (UID: "8b9dacad-1c52-4236-9b38-e4d91fea47c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.050183 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b9dacad-1c52-4236-9b38-e4d91fea47c8" (UID: "8b9dacad-1c52-4236-9b38-e4d91fea47c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.111716 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn6dw\" (UniqueName: \"kubernetes.io/projected/8b9dacad-1c52-4236-9b38-e4d91fea47c8-kube-api-access-qn6dw\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.111762 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.111773 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.111785 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9dacad-1c52-4236-9b38-e4d91fea47c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.466384 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2mn2p" event={"ID":"8b9dacad-1c52-4236-9b38-e4d91fea47c8","Type":"ContainerDied","Data":"e14923e3b78f4a3cfc9a61d01e689aadd150b721df7bde8303b7f83c6c6b1425"} Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.466735 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e14923e3b78f4a3cfc9a61d01e689aadd150b721df7bde8303b7f83c6c6b1425" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.466467 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2mn2p" Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.667972 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.668709 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3d4f058e-0086-4779-8434-9de8045621fe" containerName="nova-api-log" containerID="cri-o://905777a4cc48ee01579b52b1194c91aff50c3a805effcfe68eab2f77c64df90c" gracePeriod=30 Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.669245 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3d4f058e-0086-4779-8434-9de8045621fe" containerName="nova-api-api" containerID="cri-o://f90aad0c81b36f22c7a7603d165ff3c87a92dc9474cf39d0644aa554e5b3c210" gracePeriod=30 Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.685485 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.685777 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4811fb25-573c-4a84-82ed-9bc01eb00341" containerName="nova-scheduler-scheduler" containerID="cri-o://fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2" gracePeriod=30 Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.725975 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.726326 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-log" containerID="cri-o://07050799506c33f4231ca15ac581a760732b8048f95c8abf59f71989beccfa9d" gracePeriod=30 Dec 04 12:41:05 crc kubenswrapper[4760]: I1204 12:41:05.726385 4760 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-metadata" containerID="cri-o://fc09e452bc279ff66d05279144734f7d866414ef534e87853507ed99fb1648df" gracePeriod=30 Dec 04 12:41:06 crc kubenswrapper[4760]: I1204 12:41:06.481591 4760 generic.go:334] "Generic (PLEG): container finished" podID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerID="07050799506c33f4231ca15ac581a760732b8048f95c8abf59f71989beccfa9d" exitCode=143 Dec 04 12:41:06 crc kubenswrapper[4760]: I1204 12:41:06.481701 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28ce9eb2-8a0c-4215-86f1-4e012455caa1","Type":"ContainerDied","Data":"07050799506c33f4231ca15ac581a760732b8048f95c8abf59f71989beccfa9d"} Dec 04 12:41:06 crc kubenswrapper[4760]: I1204 12:41:06.486149 4760 generic.go:334] "Generic (PLEG): container finished" podID="3d4f058e-0086-4779-8434-9de8045621fe" containerID="905777a4cc48ee01579b52b1194c91aff50c3a805effcfe68eab2f77c64df90c" exitCode=143 Dec 04 12:41:06 crc kubenswrapper[4760]: I1204 12:41:06.486319 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d4f058e-0086-4779-8434-9de8045621fe","Type":"ContainerDied","Data":"905777a4cc48ee01579b52b1194c91aff50c3a805effcfe68eab2f77c64df90c"} Dec 04 12:41:07 crc kubenswrapper[4760]: E1204 12:41:07.902665 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2 is running failed: container process not found" containerID="fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 12:41:07 crc kubenswrapper[4760]: E1204 12:41:07.903525 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2 is running failed: container process not found" containerID="fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 12:41:07 crc kubenswrapper[4760]: E1204 12:41:07.904905 4760 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2 is running failed: container process not found" containerID="fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 12:41:07 crc kubenswrapper[4760]: E1204 12:41:07.904957 4760 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4811fb25-573c-4a84-82ed-9bc01eb00341" containerName="nova-scheduler-scheduler" Dec 04 12:41:08 crc kubenswrapper[4760]: I1204 12:41:08.514026 4760 generic.go:334] "Generic (PLEG): container finished" podID="4811fb25-573c-4a84-82ed-9bc01eb00341" containerID="fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2" exitCode=0 Dec 04 12:41:08 crc kubenswrapper[4760]: I1204 12:41:08.514096 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4811fb25-573c-4a84-82ed-9bc01eb00341","Type":"ContainerDied","Data":"fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2"} Dec 04 12:41:08 crc kubenswrapper[4760]: I1204 12:41:08.876287 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.014224 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-config-data\") pod \"4811fb25-573c-4a84-82ed-9bc01eb00341\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.014442 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-combined-ca-bundle\") pod \"4811fb25-573c-4a84-82ed-9bc01eb00341\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.014593 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2j2m\" (UniqueName: \"kubernetes.io/projected/4811fb25-573c-4a84-82ed-9bc01eb00341-kube-api-access-w2j2m\") pod \"4811fb25-573c-4a84-82ed-9bc01eb00341\" (UID: \"4811fb25-573c-4a84-82ed-9bc01eb00341\") " Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.020713 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4811fb25-573c-4a84-82ed-9bc01eb00341-kube-api-access-w2j2m" (OuterVolumeSpecName: "kube-api-access-w2j2m") pod "4811fb25-573c-4a84-82ed-9bc01eb00341" (UID: "4811fb25-573c-4a84-82ed-9bc01eb00341"). InnerVolumeSpecName "kube-api-access-w2j2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.050168 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4811fb25-573c-4a84-82ed-9bc01eb00341" (UID: "4811fb25-573c-4a84-82ed-9bc01eb00341"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.051321 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-config-data" (OuterVolumeSpecName: "config-data") pod "4811fb25-573c-4a84-82ed-9bc01eb00341" (UID: "4811fb25-573c-4a84-82ed-9bc01eb00341"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.117322 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.117364 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2j2m\" (UniqueName: \"kubernetes.io/projected/4811fb25-573c-4a84-82ed-9bc01eb00341-kube-api-access-w2j2m\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.117383 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4811fb25-573c-4a84-82ed-9bc01eb00341-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.178476 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": read tcp 10.217.0.2:39066->10.217.0.208:8775: read: connection reset by peer" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.178539 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": read tcp 
10.217.0.2:39052->10.217.0.208:8775: read: connection reset by peer" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.555586 4760 generic.go:334] "Generic (PLEG): container finished" podID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerID="fc09e452bc279ff66d05279144734f7d866414ef534e87853507ed99fb1648df" exitCode=0 Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.555947 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28ce9eb2-8a0c-4215-86f1-4e012455caa1","Type":"ContainerDied","Data":"fc09e452bc279ff66d05279144734f7d866414ef534e87853507ed99fb1648df"} Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.564618 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4811fb25-573c-4a84-82ed-9bc01eb00341","Type":"ContainerDied","Data":"4c7bbb7499e64883c8699e1fc0e2cfd8f3d039caabbc8204afdbc0447b67a63a"} Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.564688 4760 scope.go:117] "RemoveContainer" containerID="fb888c3ea7ace93a1a6fc5e31dc79ce70df4a995d69a14c2e02452aa8a12b4a2" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.564935 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.672134 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.692032 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.706580 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:41:09 crc kubenswrapper[4760]: E1204 12:41:09.707384 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d016d51-bdd1-490e-8751-a0c0bcf70f92" containerName="dnsmasq-dns" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.707411 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d016d51-bdd1-490e-8751-a0c0bcf70f92" containerName="dnsmasq-dns" Dec 04 12:41:09 crc kubenswrapper[4760]: E1204 12:41:09.707434 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9dacad-1c52-4236-9b38-e4d91fea47c8" containerName="nova-manage" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.707441 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9dacad-1c52-4236-9b38-e4d91fea47c8" containerName="nova-manage" Dec 04 12:41:09 crc kubenswrapper[4760]: E1204 12:41:09.707471 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d016d51-bdd1-490e-8751-a0c0bcf70f92" containerName="init" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.707477 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d016d51-bdd1-490e-8751-a0c0bcf70f92" containerName="init" Dec 04 12:41:09 crc kubenswrapper[4760]: E1204 12:41:09.707501 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4811fb25-573c-4a84-82ed-9bc01eb00341" containerName="nova-scheduler-scheduler" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.707509 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4811fb25-573c-4a84-82ed-9bc01eb00341" containerName="nova-scheduler-scheduler" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.707734 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="4811fb25-573c-4a84-82ed-9bc01eb00341" containerName="nova-scheduler-scheduler" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.707753 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9dacad-1c52-4236-9b38-e4d91fea47c8" containerName="nova-manage" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.707766 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d016d51-bdd1-490e-8751-a0c0bcf70f92" containerName="dnsmasq-dns" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.708742 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.714705 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.716460 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.743966 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhrs6\" (UniqueName: \"kubernetes.io/projected/deac79a1-f615-4bba-b00f-11784b824094-kube-api-access-rhrs6\") pod \"nova-scheduler-0\" (UID: \"deac79a1-f615-4bba-b00f-11784b824094\") " pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.744022 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deac79a1-f615-4bba-b00f-11784b824094-config-data\") pod \"nova-scheduler-0\" (UID: \"deac79a1-f615-4bba-b00f-11784b824094\") " pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc 
kubenswrapper[4760]: I1204 12:41:09.744121 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deac79a1-f615-4bba-b00f-11784b824094-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"deac79a1-f615-4bba-b00f-11784b824094\") " pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.796749 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.846796 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deac79a1-f615-4bba-b00f-11784b824094-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"deac79a1-f615-4bba-b00f-11784b824094\") " pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.847067 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhrs6\" (UniqueName: \"kubernetes.io/projected/deac79a1-f615-4bba-b00f-11784b824094-kube-api-access-rhrs6\") pod \"nova-scheduler-0\" (UID: \"deac79a1-f615-4bba-b00f-11784b824094\") " pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.847101 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deac79a1-f615-4bba-b00f-11784b824094-config-data\") pod \"nova-scheduler-0\" (UID: \"deac79a1-f615-4bba-b00f-11784b824094\") " pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.854176 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deac79a1-f615-4bba-b00f-11784b824094-config-data\") pod \"nova-scheduler-0\" (UID: \"deac79a1-f615-4bba-b00f-11784b824094\") " pod="openstack/nova-scheduler-0" 
Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.866000 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deac79a1-f615-4bba-b00f-11784b824094-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"deac79a1-f615-4bba-b00f-11784b824094\") " pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.870079 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhrs6\" (UniqueName: \"kubernetes.io/projected/deac79a1-f615-4bba-b00f-11784b824094-kube-api-access-rhrs6\") pod \"nova-scheduler-0\" (UID: \"deac79a1-f615-4bba-b00f-11784b824094\") " pod="openstack/nova-scheduler-0" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.907895 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4811fb25-573c-4a84-82ed-9bc01eb00341" path="/var/lib/kubelet/pods/4811fb25-573c-4a84-82ed-9bc01eb00341/volumes" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.948368 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-config-data\") pod \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.949010 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsk2g\" (UniqueName: \"kubernetes.io/projected/28ce9eb2-8a0c-4215-86f1-4e012455caa1-kube-api-access-nsk2g\") pod \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.949134 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-nova-metadata-tls-certs\") pod 
\"28ce9eb2-8a0c-4215-86f1-4e012455caa1\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.949349 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-combined-ca-bundle\") pod \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.949590 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ce9eb2-8a0c-4215-86f1-4e012455caa1-logs\") pod \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\" (UID: \"28ce9eb2-8a0c-4215-86f1-4e012455caa1\") " Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.951750 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ce9eb2-8a0c-4215-86f1-4e012455caa1-logs" (OuterVolumeSpecName: "logs") pod "28ce9eb2-8a0c-4215-86f1-4e012455caa1" (UID: "28ce9eb2-8a0c-4215-86f1-4e012455caa1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.952441 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ce9eb2-8a0c-4215-86f1-4e012455caa1-kube-api-access-nsk2g" (OuterVolumeSpecName: "kube-api-access-nsk2g") pod "28ce9eb2-8a0c-4215-86f1-4e012455caa1" (UID: "28ce9eb2-8a0c-4215-86f1-4e012455caa1"). InnerVolumeSpecName "kube-api-access-nsk2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.986016 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-config-data" (OuterVolumeSpecName: "config-data") pod "28ce9eb2-8a0c-4215-86f1-4e012455caa1" (UID: "28ce9eb2-8a0c-4215-86f1-4e012455caa1"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:09 crc kubenswrapper[4760]: I1204 12:41:09.996659 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28ce9eb2-8a0c-4215-86f1-4e012455caa1" (UID: "28ce9eb2-8a0c-4215-86f1-4e012455caa1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.022422 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "28ce9eb2-8a0c-4215-86f1-4e012455caa1" (UID: "28ce9eb2-8a0c-4215-86f1-4e012455caa1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.052707 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.052750 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsk2g\" (UniqueName: \"kubernetes.io/projected/28ce9eb2-8a0c-4215-86f1-4e012455caa1-kube-api-access-nsk2g\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.052761 4760 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.052771 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/28ce9eb2-8a0c-4215-86f1-4e012455caa1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.052780 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ce9eb2-8a0c-4215-86f1-4e012455caa1-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.112076 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.585280 4760 generic.go:334] "Generic (PLEG): container finished" podID="3d4f058e-0086-4779-8434-9de8045621fe" containerID="f90aad0c81b36f22c7a7603d165ff3c87a92dc9474cf39d0644aa554e5b3c210" exitCode=0 Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.585390 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d4f058e-0086-4779-8434-9de8045621fe","Type":"ContainerDied","Data":"f90aad0c81b36f22c7a7603d165ff3c87a92dc9474cf39d0644aa554e5b3c210"} Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.595127 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28ce9eb2-8a0c-4215-86f1-4e012455caa1","Type":"ContainerDied","Data":"c1f5fbc249d6788a027f19a2e72d504142248535ced221cafc0e26f99b92e1df"} Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.595207 4760 scope.go:117] "RemoveContainer" containerID="fc09e452bc279ff66d05279144734f7d866414ef534e87853507ed99fb1648df" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.595375 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.623682 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.635767 4760 scope.go:117] "RemoveContainer" containerID="07050799506c33f4231ca15ac581a760732b8048f95c8abf59f71989beccfa9d" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.657962 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.679869 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.700078 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:41:10 crc kubenswrapper[4760]: E1204 12:41:10.703791 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-metadata" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.703831 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-metadata" Dec 04 12:41:10 crc kubenswrapper[4760]: E1204 12:41:10.703903 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-log" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.703910 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-log" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.704176 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-metadata" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.704226 4760 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" containerName="nova-metadata-log" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.705612 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.708938 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.711233 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.739728 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.764672 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.874857 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-public-tls-certs\") pod \"3d4f058e-0086-4779-8434-9de8045621fe\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.875092 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-config-data\") pod \"3d4f058e-0086-4779-8434-9de8045621fe\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.875306 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c227\" (UniqueName: \"kubernetes.io/projected/3d4f058e-0086-4779-8434-9de8045621fe-kube-api-access-4c227\") pod \"3d4f058e-0086-4779-8434-9de8045621fe\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " Dec 04 
12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.876118 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d4f058e-0086-4779-8434-9de8045621fe-logs\") pod \"3d4f058e-0086-4779-8434-9de8045621fe\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.876199 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-combined-ca-bundle\") pod \"3d4f058e-0086-4779-8434-9de8045621fe\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.876247 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-internal-tls-certs\") pod \"3d4f058e-0086-4779-8434-9de8045621fe\" (UID: \"3d4f058e-0086-4779-8434-9de8045621fe\") " Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.885247 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7631823-c503-49fd-85ac-ec4b8bc18a5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.885437 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxc6m\" (UniqueName: \"kubernetes.io/projected/d7631823-c503-49fd-85ac-ec4b8bc18a5b-kube-api-access-kxc6m\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.885489 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d7631823-c503-49fd-85ac-ec4b8bc18a5b-config-data\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.885515 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7631823-c503-49fd-85ac-ec4b8bc18a5b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.885610 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7631823-c503-49fd-85ac-ec4b8bc18a5b-logs\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.886772 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d4f058e-0086-4779-8434-9de8045621fe-logs" (OuterVolumeSpecName: "logs") pod "3d4f058e-0086-4779-8434-9de8045621fe" (UID: "3d4f058e-0086-4779-8434-9de8045621fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.896128 4760 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d4f058e-0086-4779-8434-9de8045621fe-logs\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.936763 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4f058e-0086-4779-8434-9de8045621fe-kube-api-access-4c227" (OuterVolumeSpecName: "kube-api-access-4c227") pod "3d4f058e-0086-4779-8434-9de8045621fe" (UID: "3d4f058e-0086-4779-8434-9de8045621fe"). 
InnerVolumeSpecName "kube-api-access-4c227". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.997876 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxc6m\" (UniqueName: \"kubernetes.io/projected/d7631823-c503-49fd-85ac-ec4b8bc18a5b-kube-api-access-kxc6m\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.997931 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7631823-c503-49fd-85ac-ec4b8bc18a5b-config-data\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.997956 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7631823-c503-49fd-85ac-ec4b8bc18a5b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.998017 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7631823-c503-49fd-85ac-ec4b8bc18a5b-logs\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.998263 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7631823-c503-49fd-85ac-ec4b8bc18a5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:10 crc kubenswrapper[4760]: I1204 12:41:10.998363 4760 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c227\" (UniqueName: \"kubernetes.io/projected/3d4f058e-0086-4779-8434-9de8045621fe-kube-api-access-4c227\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.000528 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-config-data" (OuterVolumeSpecName: "config-data") pod "3d4f058e-0086-4779-8434-9de8045621fe" (UID: "3d4f058e-0086-4779-8434-9de8045621fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.003731 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7631823-c503-49fd-85ac-ec4b8bc18a5b-logs\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.009586 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7631823-c503-49fd-85ac-ec4b8bc18a5b-config-data\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.011670 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7631823-c503-49fd-85ac-ec4b8bc18a5b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.017611 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d4f058e-0086-4779-8434-9de8045621fe" 
(UID: "3d4f058e-0086-4779-8434-9de8045621fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.057427 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxc6m\" (UniqueName: \"kubernetes.io/projected/d7631823-c503-49fd-85ac-ec4b8bc18a5b-kube-api-access-kxc6m\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.086334 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7631823-c503-49fd-85ac-ec4b8bc18a5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7631823-c503-49fd-85ac-ec4b8bc18a5b\") " pod="openstack/nova-metadata-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.087132 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.106248 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.106312 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.115504 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3d4f058e-0086-4779-8434-9de8045621fe" (UID: "3d4f058e-0086-4779-8434-9de8045621fe"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.120336 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3d4f058e-0086-4779-8434-9de8045621fe" (UID: "3d4f058e-0086-4779-8434-9de8045621fe"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.209467 4760 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.209501 4760 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d4f058e-0086-4779-8434-9de8045621fe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.615435 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"deac79a1-f615-4bba-b00f-11784b824094","Type":"ContainerStarted","Data":"0f53f603f1b8127c199640d26ebb87808df46736dc0110e0fbdd0b905019b15c"} Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.615776 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"deac79a1-f615-4bba-b00f-11784b824094","Type":"ContainerStarted","Data":"fa49045c4a30fc3509ae1caae94e7daab3c1391bf48b350cc5d70dc68cae05b3"} Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.621165 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d4f058e-0086-4779-8434-9de8045621fe","Type":"ContainerDied","Data":"ac0d42571b42f5c66c20510c8e1d6c223d7760d1ed753cc0294d886e5065f9ba"} Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.621260 
4760 scope.go:117] "RemoveContainer" containerID="f90aad0c81b36f22c7a7603d165ff3c87a92dc9474cf39d0644aa554e5b3c210" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.621292 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.641049 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6410144989999997 podStartE2EDuration="2.641014499s" podCreationTimestamp="2025-12-04 12:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:41:11.634276155 +0000 UTC m=+1674.675722722" watchObservedRunningTime="2025-12-04 12:41:11.641014499 +0000 UTC m=+1674.682461066" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.656499 4760 scope.go:117] "RemoveContainer" containerID="905777a4cc48ee01579b52b1194c91aff50c3a805effcfe68eab2f77c64df90c" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.685907 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.703775 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.715665 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 12:41:11 crc kubenswrapper[4760]: E1204 12:41:11.716502 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4f058e-0086-4779-8434-9de8045621fe" containerName="nova-api-api" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.716534 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4f058e-0086-4779-8434-9de8045621fe" containerName="nova-api-api" Dec 04 12:41:11 crc kubenswrapper[4760]: E1204 12:41:11.716671 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3d4f058e-0086-4779-8434-9de8045621fe" containerName="nova-api-log" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.716687 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4f058e-0086-4779-8434-9de8045621fe" containerName="nova-api-log" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.716974 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4f058e-0086-4779-8434-9de8045621fe" containerName="nova-api-log" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.717001 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4f058e-0086-4779-8434-9de8045621fe" containerName="nova-api-api" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.718689 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.723682 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.723758 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-logs\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.723791 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-config-data\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.723979 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcjzb\" (UniqueName: \"kubernetes.io/projected/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-kube-api-access-hcjzb\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.724036 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.724055 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-public-tls-certs\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.724383 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.724606 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.724662 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.748432 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:41:11 crc kubenswrapper[4760]: W1204 12:41:11.778784 4760 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7631823_c503_49fd_85ac_ec4b8bc18a5b.slice/crio-4500aa3e3ad2ab50221b0a6299df58debd13f94e1e9ef5979a2da2a5319bd5e8 WatchSource:0}: Error finding container 4500aa3e3ad2ab50221b0a6299df58debd13f94e1e9ef5979a2da2a5319bd5e8: Status 404 returned error can't find the container with id 4500aa3e3ad2ab50221b0a6299df58debd13f94e1e9ef5979a2da2a5319bd5e8 Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.780171 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.827421 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcjzb\" (UniqueName: \"kubernetes.io/projected/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-kube-api-access-hcjzb\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.827579 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.827609 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-public-tls-certs\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.828831 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " 
pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.828896 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-logs\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.828946 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-config-data\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.829557 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-logs\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.835224 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.836858 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-config-data\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.838695 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.839361 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.853662 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcjzb\" (UniqueName: \"kubernetes.io/projected/3f4c620c-1c1a-4aa3-aa92-5df1a205e70d-kube-api-access-hcjzb\") pod \"nova-api-0\" (UID: \"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d\") " pod="openstack/nova-api-0" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.882502 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ce9eb2-8a0c-4215-86f1-4e012455caa1" path="/var/lib/kubelet/pods/28ce9eb2-8a0c-4215-86f1-4e012455caa1/volumes" Dec 04 12:41:11 crc kubenswrapper[4760]: I1204 12:41:11.883689 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d4f058e-0086-4779-8434-9de8045621fe" path="/var/lib/kubelet/pods/3d4f058e-0086-4779-8434-9de8045621fe/volumes" Dec 04 12:41:12 crc kubenswrapper[4760]: I1204 12:41:12.046849 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 12:41:12 crc kubenswrapper[4760]: I1204 12:41:12.576521 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 12:41:12 crc kubenswrapper[4760]: I1204 12:41:12.641147 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d","Type":"ContainerStarted","Data":"c931938dac13be0fa45698554aa090c6ae3975edea9d1b21e32568e3650c738c"} Dec 04 12:41:12 crc kubenswrapper[4760]: I1204 12:41:12.667651 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7631823-c503-49fd-85ac-ec4b8bc18a5b","Type":"ContainerStarted","Data":"00b0c362ad7a6be02dc42e0427304b96206a88d896c17d14e5be608492cd9db2"} Dec 04 12:41:12 crc kubenswrapper[4760]: I1204 12:41:12.667925 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7631823-c503-49fd-85ac-ec4b8bc18a5b","Type":"ContainerStarted","Data":"91a45c0117aa25a384f77da4d77b641c1ab23a32de1973e0d9400cbe54368add"} Dec 04 12:41:12 crc kubenswrapper[4760]: I1204 12:41:12.667936 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7631823-c503-49fd-85ac-ec4b8bc18a5b","Type":"ContainerStarted","Data":"4500aa3e3ad2ab50221b0a6299df58debd13f94e1e9ef5979a2da2a5319bd5e8"} Dec 04 12:41:12 crc kubenswrapper[4760]: I1204 12:41:12.689739 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.689705447 podStartE2EDuration="2.689705447s" podCreationTimestamp="2025-12-04 12:41:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:41:12.688746697 +0000 UTC m=+1675.730193274" watchObservedRunningTime="2025-12-04 12:41:12.689705447 +0000 UTC m=+1675.731152014" Dec 04 12:41:13 crc kubenswrapper[4760]: 
I1204 12:41:13.685915 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d","Type":"ContainerStarted","Data":"7ac46905be3d36490e4ff902ab98499ac3561616b240f8cd03fb129cf513dba0"} Dec 04 12:41:13 crc kubenswrapper[4760]: I1204 12:41:13.686290 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3f4c620c-1c1a-4aa3-aa92-5df1a205e70d","Type":"ContainerStarted","Data":"9a44bc3e7332ea6020c64914b1f447dab1d39ae5020f67b88e052c2020ff7ff5"} Dec 04 12:41:13 crc kubenswrapper[4760]: I1204 12:41:13.713574 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.713533426 podStartE2EDuration="2.713533426s" podCreationTimestamp="2025-12-04 12:41:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:41:13.710396047 +0000 UTC m=+1676.751842624" watchObservedRunningTime="2025-12-04 12:41:13.713533426 +0000 UTC m=+1676.754979993" Dec 04 12:41:13 crc kubenswrapper[4760]: I1204 12:41:13.990885 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:41:14 crc kubenswrapper[4760]: I1204 12:41:14.056930 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6cgxg"] Dec 04 12:41:14 crc kubenswrapper[4760]: I1204 12:41:14.696770 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6cgxg" podUID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerName="registry-server" containerID="cri-o://6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3" gracePeriod=2 Dec 04 12:41:14 crc kubenswrapper[4760]: I1204 12:41:14.865152 4760 scope.go:117] "RemoveContainer" 
containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:41:14 crc kubenswrapper[4760]: E1204 12:41:14.865603 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.120962 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.405230 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.448084 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgqt6\" (UniqueName: \"kubernetes.io/projected/5108bcea-b5b8-4a60-9a40-929ab1dfa845-kube-api-access-hgqt6\") pod \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.448161 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-catalog-content\") pod \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\" (UID: \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.448470 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-utilities\") pod \"5108bcea-b5b8-4a60-9a40-929ab1dfa845\" (UID: 
\"5108bcea-b5b8-4a60-9a40-929ab1dfa845\") " Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.449743 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-utilities" (OuterVolumeSpecName: "utilities") pod "5108bcea-b5b8-4a60-9a40-929ab1dfa845" (UID: "5108bcea-b5b8-4a60-9a40-929ab1dfa845"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.453947 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5108bcea-b5b8-4a60-9a40-929ab1dfa845-kube-api-access-hgqt6" (OuterVolumeSpecName: "kube-api-access-hgqt6") pod "5108bcea-b5b8-4a60-9a40-929ab1dfa845" (UID: "5108bcea-b5b8-4a60-9a40-929ab1dfa845"). InnerVolumeSpecName "kube-api-access-hgqt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.513151 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5108bcea-b5b8-4a60-9a40-929ab1dfa845" (UID: "5108bcea-b5b8-4a60-9a40-929ab1dfa845"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.558041 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.558108 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgqt6\" (UniqueName: \"kubernetes.io/projected/5108bcea-b5b8-4a60-9a40-929ab1dfa845-kube-api-access-hgqt6\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.558126 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5108bcea-b5b8-4a60-9a40-929ab1dfa845-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.710156 4760 generic.go:334] "Generic (PLEG): container finished" podID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerID="6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3" exitCode=0 Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.710253 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgxg" event={"ID":"5108bcea-b5b8-4a60-9a40-929ab1dfa845","Type":"ContainerDied","Data":"6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3"} Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.710301 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgxg" event={"ID":"5108bcea-b5b8-4a60-9a40-929ab1dfa845","Type":"ContainerDied","Data":"0e1a9a88bbe82fd0670a4f427248688b4e2561809cfefc2ff63c45a6070273d8"} Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.710326 4760 scope.go:117] "RemoveContainer" containerID="6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 
12:41:15.710372 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cgxg" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.734849 4760 scope.go:117] "RemoveContainer" containerID="10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.751359 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6cgxg"] Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.765737 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6cgxg"] Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.794488 4760 scope.go:117] "RemoveContainer" containerID="e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.823682 4760 scope.go:117] "RemoveContainer" containerID="6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3" Dec 04 12:41:15 crc kubenswrapper[4760]: E1204 12:41:15.824366 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3\": container with ID starting with 6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3 not found: ID does not exist" containerID="6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.824410 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3"} err="failed to get container status \"6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3\": rpc error: code = NotFound desc = could not find container \"6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3\": container with ID starting with 
6822d9e5cc39adcb8b4ae9d48887b2b8b1159006354433b68856177d681bbdd3 not found: ID does not exist" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.824435 4760 scope.go:117] "RemoveContainer" containerID="10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450" Dec 04 12:41:15 crc kubenswrapper[4760]: E1204 12:41:15.824821 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450\": container with ID starting with 10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450 not found: ID does not exist" containerID="10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.824859 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450"} err="failed to get container status \"10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450\": rpc error: code = NotFound desc = could not find container \"10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450\": container with ID starting with 10aa4644afc48d43948bdd8fb350b66694b1598ece99d13f963ffad4b9bdc450 not found: ID does not exist" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.824884 4760 scope.go:117] "RemoveContainer" containerID="e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e" Dec 04 12:41:15 crc kubenswrapper[4760]: E1204 12:41:15.825107 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e\": container with ID starting with e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e not found: ID does not exist" containerID="e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e" Dec 04 12:41:15 crc 
kubenswrapper[4760]: I1204 12:41:15.825130 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e"} err="failed to get container status \"e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e\": rpc error: code = NotFound desc = could not find container \"e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e\": container with ID starting with e2b77d2071451caf640340aca3ed73eb062df87ae5a826b21f679ee581f1bb5e not found: ID does not exist" Dec 04 12:41:15 crc kubenswrapper[4760]: I1204 12:41:15.879969 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" path="/var/lib/kubelet/pods/5108bcea-b5b8-4a60-9a40-929ab1dfa845/volumes" Dec 04 12:41:16 crc kubenswrapper[4760]: I1204 12:41:16.088323 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 12:41:16 crc kubenswrapper[4760]: I1204 12:41:16.088395 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 12:41:20 crc kubenswrapper[4760]: I1204 12:41:20.117269 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 12:41:20 crc kubenswrapper[4760]: I1204 12:41:20.157468 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 12:41:20 crc kubenswrapper[4760]: I1204 12:41:20.407769 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 12:41:20 crc kubenswrapper[4760]: I1204 12:41:20.816251 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 12:41:21 crc kubenswrapper[4760]: 
I1204 12:41:21.088981 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 12:41:21 crc kubenswrapper[4760]: I1204 12:41:21.089049 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 12:41:22 crc kubenswrapper[4760]: I1204 12:41:22.048194 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 12:41:22 crc kubenswrapper[4760]: I1204 12:41:22.049112 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 12:41:22 crc kubenswrapper[4760]: I1204 12:41:22.105525 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d7631823-c503-49fd-85ac-ec4b8bc18a5b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:41:22 crc kubenswrapper[4760]: I1204 12:41:22.105539 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d7631823-c503-49fd-85ac-ec4b8bc18a5b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:41:23 crc kubenswrapper[4760]: I1204 12:41:23.062608 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3f4c620c-1c1a-4aa3-aa92-5df1a205e70d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 12:41:23 crc kubenswrapper[4760]: I1204 12:41:23.062675 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3f4c620c-1c1a-4aa3-aa92-5df1a205e70d" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 12:41:25 crc kubenswrapper[4760]: I1204 12:41:25.865019 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:41:25 crc kubenswrapper[4760]: E1204 12:41:25.865609 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.759891 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.893202 4760 generic.go:334] "Generic (PLEG): container finished" podID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerID="f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928" exitCode=137 Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.893289 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f8eee46-6a6d-4385-abac-8230ecef5165","Type":"ContainerDied","Data":"f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928"} Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.893336 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f8eee46-6a6d-4385-abac-8230ecef5165","Type":"ContainerDied","Data":"cc9c955e02b743b1a7ce3fa1b212f049c713034a2965c5504fc8916542bf0565"} Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.893332 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.893362 4760 scope.go:117] "RemoveContainer" containerID="f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.926975 4760 scope.go:117] "RemoveContainer" containerID="3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.949403 4760 scope.go:117] "RemoveContainer" containerID="9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.950403 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-run-httpd\") pod \"3f8eee46-6a6d-4385-abac-8230ecef5165\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.950568 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84s42\" (UniqueName: \"kubernetes.io/projected/3f8eee46-6a6d-4385-abac-8230ecef5165-kube-api-access-84s42\") pod \"3f8eee46-6a6d-4385-abac-8230ecef5165\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.950670 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-log-httpd\") pod \"3f8eee46-6a6d-4385-abac-8230ecef5165\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.950716 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-scripts\") pod \"3f8eee46-6a6d-4385-abac-8230ecef5165\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " Dec 04 12:41:26 
crc kubenswrapper[4760]: I1204 12:41:26.950796 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-combined-ca-bundle\") pod \"3f8eee46-6a6d-4385-abac-8230ecef5165\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.950862 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-ceilometer-tls-certs\") pod \"3f8eee46-6a6d-4385-abac-8230ecef5165\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.950945 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-sg-core-conf-yaml\") pod \"3f8eee46-6a6d-4385-abac-8230ecef5165\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.951025 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-config-data\") pod \"3f8eee46-6a6d-4385-abac-8230ecef5165\" (UID: \"3f8eee46-6a6d-4385-abac-8230ecef5165\") " Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.951177 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3f8eee46-6a6d-4385-abac-8230ecef5165" (UID: "3f8eee46-6a6d-4385-abac-8230ecef5165"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.952590 4760 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.952752 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3f8eee46-6a6d-4385-abac-8230ecef5165" (UID: "3f8eee46-6a6d-4385-abac-8230ecef5165"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.968479 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-scripts" (OuterVolumeSpecName: "scripts") pod "3f8eee46-6a6d-4385-abac-8230ecef5165" (UID: "3f8eee46-6a6d-4385-abac-8230ecef5165"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.970860 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f8eee46-6a6d-4385-abac-8230ecef5165-kube-api-access-84s42" (OuterVolumeSpecName: "kube-api-access-84s42") pod "3f8eee46-6a6d-4385-abac-8230ecef5165" (UID: "3f8eee46-6a6d-4385-abac-8230ecef5165"). InnerVolumeSpecName "kube-api-access-84s42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.976719 4760 scope.go:117] "RemoveContainer" containerID="e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7" Dec 04 12:41:26 crc kubenswrapper[4760]: I1204 12:41:26.993526 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3f8eee46-6a6d-4385-abac-8230ecef5165" (UID: "3f8eee46-6a6d-4385-abac-8230ecef5165"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.021248 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3f8eee46-6a6d-4385-abac-8230ecef5165" (UID: "3f8eee46-6a6d-4385-abac-8230ecef5165"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.051148 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f8eee46-6a6d-4385-abac-8230ecef5165" (UID: "3f8eee46-6a6d-4385-abac-8230ecef5165"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.055019 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84s42\" (UniqueName: \"kubernetes.io/projected/3f8eee46-6a6d-4385-abac-8230ecef5165-kube-api-access-84s42\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.055048 4760 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f8eee46-6a6d-4385-abac-8230ecef5165-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.055058 4760 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.055069 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.055078 4760 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.055088 4760 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.075401 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-config-data" (OuterVolumeSpecName: "config-data") pod "3f8eee46-6a6d-4385-abac-8230ecef5165" (UID: 
"3f8eee46-6a6d-4385-abac-8230ecef5165"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.096340 4760 scope.go:117] "RemoveContainer" containerID="f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928" Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.096892 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928\": container with ID starting with f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928 not found: ID does not exist" containerID="f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.096926 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928"} err="failed to get container status \"f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928\": rpc error: code = NotFound desc = could not find container \"f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928\": container with ID starting with f5e2fb11c91072f241c6792fcf985edfe967f30d1db6e93e25f62fd243eda928 not found: ID does not exist" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.096953 4760 scope.go:117] "RemoveContainer" containerID="3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b" Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.097187 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b\": container with ID starting with 3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b not found: ID does not exist" 
containerID="3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.097226 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b"} err="failed to get container status \"3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b\": rpc error: code = NotFound desc = could not find container \"3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b\": container with ID starting with 3fa6c3145a7d8e08dc7aff881b1e6e87906cf8cf8fac1008073884822456498b not found: ID does not exist" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.097243 4760 scope.go:117] "RemoveContainer" containerID="9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9" Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.097444 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9\": container with ID starting with 9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9 not found: ID does not exist" containerID="9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.097464 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9"} err="failed to get container status \"9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9\": rpc error: code = NotFound desc = could not find container \"9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9\": container with ID starting with 9085239f263641967e907c9408a2facbc9eafeb4b673951b77241ef122f40ba9 not found: ID does not exist" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.097479 4760 scope.go:117] 
"RemoveContainer" containerID="e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7" Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.097660 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7\": container with ID starting with e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7 not found: ID does not exist" containerID="e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.097681 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7"} err="failed to get container status \"e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7\": rpc error: code = NotFound desc = could not find container \"e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7\": container with ID starting with e9f0ceb7ca2e7402dab260eead7d933c0b231682925fe15412a54fd55b14c0d7 not found: ID does not exist" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.156012 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8eee46-6a6d-4385-abac-8230ecef5165-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.234529 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.246151 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.268140 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.268755 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="sg-core" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.268775 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="sg-core" Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.268801 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="ceilometer-notification-agent" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.268809 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="ceilometer-notification-agent" Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.268826 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerName="extract-utilities" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.268834 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerName="extract-utilities" Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.268858 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerName="extract-content" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.268865 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerName="extract-content" Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.268879 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="ceilometer-central-agent" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.268887 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="ceilometer-central-agent" Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.268905 4760 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerName="registry-server" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.268913 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerName="registry-server" Dec 04 12:41:27 crc kubenswrapper[4760]: E1204 12:41:27.269158 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="proxy-httpd" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.269177 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="proxy-httpd" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.271558 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5108bcea-b5b8-4a60-9a40-929ab1dfa845" containerName="registry-server" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.271599 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="ceilometer-notification-agent" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.271634 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="proxy-httpd" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.271644 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="sg-core" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.271658 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" containerName="ceilometer-central-agent" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.274296 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.277983 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.278245 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.278904 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.287804 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.462085 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b898fcbf-2997-40cf-b167-8875a2763092-log-httpd\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.462884 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.463002 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.463119 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-config-data\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.463250 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8vdh\" (UniqueName: \"kubernetes.io/projected/b898fcbf-2997-40cf-b167-8875a2763092-kube-api-access-c8vdh\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.463407 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b898fcbf-2997-40cf-b167-8875a2763092-run-httpd\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.463583 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.463725 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-scripts\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.564801 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b898fcbf-2997-40cf-b167-8875a2763092-run-httpd\") pod \"ceilometer-0\" (UID: 
\"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.564925 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.564987 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-scripts\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.565083 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.565110 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b898fcbf-2997-40cf-b167-8875a2763092-log-httpd\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.565141 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.565172 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-config-data\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.565226 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8vdh\" (UniqueName: \"kubernetes.io/projected/b898fcbf-2997-40cf-b167-8875a2763092-kube-api-access-c8vdh\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.565370 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b898fcbf-2997-40cf-b167-8875a2763092-run-httpd\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.565639 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b898fcbf-2997-40cf-b167-8875a2763092-log-httpd\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.571411 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.571549 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: 
I1204 12:41:27.573396 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.579914 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-config-data\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.584697 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b898fcbf-2997-40cf-b167-8875a2763092-scripts\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.587528 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8vdh\" (UniqueName: \"kubernetes.io/projected/b898fcbf-2997-40cf-b167-8875a2763092-kube-api-access-c8vdh\") pod \"ceilometer-0\" (UID: \"b898fcbf-2997-40cf-b167-8875a2763092\") " pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.601797 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 12:41:27 crc kubenswrapper[4760]: I1204 12:41:27.889125 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f8eee46-6a6d-4385-abac-8230ecef5165" path="/var/lib/kubelet/pods/3f8eee46-6a6d-4385-abac-8230ecef5165/volumes" Dec 04 12:41:28 crc kubenswrapper[4760]: W1204 12:41:28.170514 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb898fcbf_2997_40cf_b167_8875a2763092.slice/crio-e738e47bfc39417fcd1654af15decf11f522509715c119a8e4486418da6b1760 WatchSource:0}: Error finding container e738e47bfc39417fcd1654af15decf11f522509715c119a8e4486418da6b1760: Status 404 returned error can't find the container with id e738e47bfc39417fcd1654af15decf11f522509715c119a8e4486418da6b1760 Dec 04 12:41:28 crc kubenswrapper[4760]: I1204 12:41:28.190299 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 12:41:28 crc kubenswrapper[4760]: I1204 12:41:28.928480 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b898fcbf-2997-40cf-b167-8875a2763092","Type":"ContainerStarted","Data":"e738e47bfc39417fcd1654af15decf11f522509715c119a8e4486418da6b1760"} Dec 04 12:41:29 crc kubenswrapper[4760]: I1204 12:41:29.943725 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b898fcbf-2997-40cf-b167-8875a2763092","Type":"ContainerStarted","Data":"ea8511469580faeee42c8298f2d64078c1c58b770389da2deb7925b0b6046f86"} Dec 04 12:41:29 crc kubenswrapper[4760]: I1204 12:41:29.944150 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b898fcbf-2997-40cf-b167-8875a2763092","Type":"ContainerStarted","Data":"ff1d7678f8729aa50254a94a59ead0a7182a9445f5b76f9666d6202773b2ad20"} Dec 04 12:41:30 crc kubenswrapper[4760]: I1204 12:41:30.962183 4760 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"b898fcbf-2997-40cf-b167-8875a2763092","Type":"ContainerStarted","Data":"13ccf9406818b3df8c8cca9ac1531d91ae772ffa1a983691d4a8a6bb71277f91"} Dec 04 12:41:31 crc kubenswrapper[4760]: I1204 12:41:31.099247 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 12:41:31 crc kubenswrapper[4760]: I1204 12:41:31.104089 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 12:41:31 crc kubenswrapper[4760]: I1204 12:41:31.110441 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 12:41:31 crc kubenswrapper[4760]: I1204 12:41:31.993269 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 12:41:32 crc kubenswrapper[4760]: I1204 12:41:32.063645 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 12:41:32 crc kubenswrapper[4760]: I1204 12:41:32.066247 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 12:41:32 crc kubenswrapper[4760]: I1204 12:41:32.086033 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 12:41:32 crc kubenswrapper[4760]: I1204 12:41:32.092960 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 12:41:32 crc kubenswrapper[4760]: I1204 12:41:32.998646 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b898fcbf-2997-40cf-b167-8875a2763092","Type":"ContainerStarted","Data":"1eb981942dc0fe7c8c2454cfe985e6c4b553b2dc80c9a3f62b656f1eebac04e6"} Dec 04 12:41:33 crc kubenswrapper[4760]: I1204 12:41:32.999731 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 12:41:33 crc 
kubenswrapper[4760]: I1204 12:41:33.019588 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 12:41:33 crc kubenswrapper[4760]: I1204 12:41:33.034364 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.367834782 podStartE2EDuration="6.034335208s" podCreationTimestamp="2025-12-04 12:41:27 +0000 UTC" firstStartedPulling="2025-12-04 12:41:28.173733968 +0000 UTC m=+1691.215180535" lastFinishedPulling="2025-12-04 12:41:31.840234394 +0000 UTC m=+1694.881680961" observedRunningTime="2025-12-04 12:41:33.026731437 +0000 UTC m=+1696.068178004" watchObservedRunningTime="2025-12-04 12:41:33.034335208 +0000 UTC m=+1696.075781765" Dec 04 12:41:34 crc kubenswrapper[4760]: I1204 12:41:34.010709 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 12:41:36 crc kubenswrapper[4760]: I1204 12:41:36.864347 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:41:36 crc kubenswrapper[4760]: E1204 12:41:36.864887 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:41:47 crc kubenswrapper[4760]: I1204 12:41:47.886985 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:41:47 crc kubenswrapper[4760]: E1204 12:41:47.887804 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:41:57 crc kubenswrapper[4760]: I1204 12:41:57.072463 4760 scope.go:117] "RemoveContainer" containerID="852c2641d49371e30153bafa2185d27d17baaf3c934ac2bc2f501a409ec34586" Dec 04 12:41:57 crc kubenswrapper[4760]: I1204 12:41:57.611062 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 12:41:59 crc kubenswrapper[4760]: I1204 12:41:59.865419 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:41:59 crc kubenswrapper[4760]: E1204 12:41:59.866306 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.094102 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rzgb6"] Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.097727 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.110429 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzgb6"] Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.228503 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-utilities\") pod \"redhat-marketplace-rzgb6\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.228573 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-catalog-content\") pod \"redhat-marketplace-rzgb6\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.229030 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxhlx\" (UniqueName: \"kubernetes.io/projected/d26d6eca-6891-4544-848a-4d34fd081522-kube-api-access-xxhlx\") pod \"redhat-marketplace-rzgb6\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.330820 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxhlx\" (UniqueName: \"kubernetes.io/projected/d26d6eca-6891-4544-848a-4d34fd081522-kube-api-access-xxhlx\") pod \"redhat-marketplace-rzgb6\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.330951 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-utilities\") pod \"redhat-marketplace-rzgb6\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.330980 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-catalog-content\") pod \"redhat-marketplace-rzgb6\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.331613 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-utilities\") pod \"redhat-marketplace-rzgb6\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.331619 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-catalog-content\") pod \"redhat-marketplace-rzgb6\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.351680 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxhlx\" (UniqueName: \"kubernetes.io/projected/d26d6eca-6891-4544-848a-4d34fd081522-kube-api-access-xxhlx\") pod \"redhat-marketplace-rzgb6\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.422809 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:04 crc kubenswrapper[4760]: I1204 12:42:04.989469 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzgb6"] Dec 04 12:42:05 crc kubenswrapper[4760]: I1204 12:42:05.824152 4760 generic.go:334] "Generic (PLEG): container finished" podID="d26d6eca-6891-4544-848a-4d34fd081522" containerID="99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833" exitCode=0 Dec 04 12:42:05 crc kubenswrapper[4760]: I1204 12:42:05.824642 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzgb6" event={"ID":"d26d6eca-6891-4544-848a-4d34fd081522","Type":"ContainerDied","Data":"99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833"} Dec 04 12:42:05 crc kubenswrapper[4760]: I1204 12:42:05.824677 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzgb6" event={"ID":"d26d6eca-6891-4544-848a-4d34fd081522","Type":"ContainerStarted","Data":"8d4ee9196de35af8f6de59f9ccd73b672065f143b8ac57343d742cdc2d792d03"} Dec 04 12:42:07 crc kubenswrapper[4760]: I1204 12:42:07.724819 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 12:42:07 crc kubenswrapper[4760]: I1204 12:42:07.881929 4760 generic.go:334] "Generic (PLEG): container finished" podID="d26d6eca-6891-4544-848a-4d34fd081522" containerID="f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174" exitCode=0 Dec 04 12:42:07 crc kubenswrapper[4760]: I1204 12:42:07.885127 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzgb6" event={"ID":"d26d6eca-6891-4544-848a-4d34fd081522","Type":"ContainerDied","Data":"f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174"} Dec 04 12:42:08 crc kubenswrapper[4760]: I1204 12:42:08.845671 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 12:42:09 crc kubenswrapper[4760]: I1204 12:42:09.928454 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzgb6" event={"ID":"d26d6eca-6891-4544-848a-4d34fd081522","Type":"ContainerStarted","Data":"d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b"} Dec 04 12:42:09 crc kubenswrapper[4760]: I1204 12:42:09.978057 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rzgb6" podStartSLOduration=3.189776629 podStartE2EDuration="5.978020086s" podCreationTimestamp="2025-12-04 12:42:04 +0000 UTC" firstStartedPulling="2025-12-04 12:42:05.828431775 +0000 UTC m=+1728.869878342" lastFinishedPulling="2025-12-04 12:42:08.616675222 +0000 UTC m=+1731.658121799" observedRunningTime="2025-12-04 12:42:09.967298666 +0000 UTC m=+1733.008745233" watchObservedRunningTime="2025-12-04 12:42:09.978020086 +0000 UTC m=+1733.019466653" Dec 04 12:42:11 crc kubenswrapper[4760]: I1204 12:42:11.865228 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:42:11 crc kubenswrapper[4760]: E1204 12:42:11.866437 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:42:13 crc kubenswrapper[4760]: I1204 12:42:13.906252 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" containerName="rabbitmq" 
containerID="cri-o://34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9" gracePeriod=604794 Dec 04 12:42:14 crc kubenswrapper[4760]: I1204 12:42:14.585857 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:14 crc kubenswrapper[4760]: I1204 12:42:14.587522 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:14 crc kubenswrapper[4760]: I1204 12:42:14.680307 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:15 crc kubenswrapper[4760]: I1204 12:42:15.062580 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:15 crc kubenswrapper[4760]: I1204 12:42:15.147945 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzgb6"] Dec 04 12:42:15 crc kubenswrapper[4760]: I1204 12:42:15.236429 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" containerName="rabbitmq" containerID="cri-o://0b5af118690eb894d7059f40d968e970ae6c204dc44afb15b8d6ec0656baa2ce" gracePeriod=604794 Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.023607 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rzgb6" podUID="d26d6eca-6891-4544-848a-4d34fd081522" containerName="registry-server" containerID="cri-o://d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b" gracePeriod=2 Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.067415 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.623934 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.766341 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-utilities\") pod \"d26d6eca-6891-4544-848a-4d34fd081522\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.766509 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxhlx\" (UniqueName: \"kubernetes.io/projected/d26d6eca-6891-4544-848a-4d34fd081522-kube-api-access-xxhlx\") pod \"d26d6eca-6891-4544-848a-4d34fd081522\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.766900 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-catalog-content\") pod \"d26d6eca-6891-4544-848a-4d34fd081522\" (UID: \"d26d6eca-6891-4544-848a-4d34fd081522\") " Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.767387 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-utilities" (OuterVolumeSpecName: "utilities") pod "d26d6eca-6891-4544-848a-4d34fd081522" (UID: "d26d6eca-6891-4544-848a-4d34fd081522"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.773510 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26d6eca-6891-4544-848a-4d34fd081522-kube-api-access-xxhlx" (OuterVolumeSpecName: "kube-api-access-xxhlx") pod "d26d6eca-6891-4544-848a-4d34fd081522" (UID: "d26d6eca-6891-4544-848a-4d34fd081522"). InnerVolumeSpecName "kube-api-access-xxhlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.790534 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d26d6eca-6891-4544-848a-4d34fd081522" (UID: "d26d6eca-6891-4544-848a-4d34fd081522"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.869667 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.869719 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26d6eca-6891-4544-848a-4d34fd081522-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:17 crc kubenswrapper[4760]: I1204 12:42:17.869731 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxhlx\" (UniqueName: \"kubernetes.io/projected/d26d6eca-6891-4544-848a-4d34fd081522-kube-api-access-xxhlx\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.000381 4760 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.037919 4760 generic.go:334] "Generic (PLEG): container finished" podID="d26d6eca-6891-4544-848a-4d34fd081522" containerID="d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b" exitCode=0 Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.037977 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzgb6" event={"ID":"d26d6eca-6891-4544-848a-4d34fd081522","Type":"ContainerDied","Data":"d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b"} Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.037987 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rzgb6" Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.038012 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzgb6" event={"ID":"d26d6eca-6891-4544-848a-4d34fd081522","Type":"ContainerDied","Data":"8d4ee9196de35af8f6de59f9ccd73b672065f143b8ac57343d742cdc2d792d03"} Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.038049 4760 scope.go:117] "RemoveContainer" containerID="d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b" Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.066059 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzgb6"] Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.075469 4760 scope.go:117] "RemoveContainer" containerID="f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174" Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.077093 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzgb6"] Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.099400 4760 scope.go:117] 
"RemoveContainer" containerID="99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833" Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.181158 4760 scope.go:117] "RemoveContainer" containerID="d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b" Dec 04 12:42:18 crc kubenswrapper[4760]: E1204 12:42:18.182763 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b\": container with ID starting with d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b not found: ID does not exist" containerID="d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b" Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.182872 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b"} err="failed to get container status \"d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b\": rpc error: code = NotFound desc = could not find container \"d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b\": container with ID starting with d928d20ef50a4245532346e3630219a2740a34f32ad2d586d447dd253cd6959b not found: ID does not exist" Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.182967 4760 scope.go:117] "RemoveContainer" containerID="f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174" Dec 04 12:42:18 crc kubenswrapper[4760]: E1204 12:42:18.183997 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174\": container with ID starting with f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174 not found: ID does not exist" containerID="f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174" Dec 04 12:42:18 crc 
kubenswrapper[4760]: I1204 12:42:18.184055 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174"} err="failed to get container status \"f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174\": rpc error: code = NotFound desc = could not find container \"f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174\": container with ID starting with f2b5c6ccc4df4622f94a138a21b22ef8a47b20727d005dbe4f0b0361c6ee7174 not found: ID does not exist" Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.184093 4760 scope.go:117] "RemoveContainer" containerID="99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833" Dec 04 12:42:18 crc kubenswrapper[4760]: E1204 12:42:18.184649 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833\": container with ID starting with 99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833 not found: ID does not exist" containerID="99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833" Dec 04 12:42:18 crc kubenswrapper[4760]: I1204 12:42:18.184712 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833"} err="failed to get container status \"99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833\": rpc error: code = NotFound desc = could not find container \"99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833\": container with ID starting with 99eef71e7e635b8c35dcd42cc4a477f9146c408d2463d7b6e15ee87298afd833 not found: ID does not exist" Dec 04 12:42:19 crc kubenswrapper[4760]: I1204 12:42:19.876574 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26d6eca-6891-4544-848a-4d34fd081522" 
path="/var/lib/kubelet/pods/d26d6eca-6891-4544-848a-4d34fd081522/volumes" Dec 04 12:42:20 crc kubenswrapper[4760]: E1204 12:42:20.307639 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice/crio-8d4ee9196de35af8f6de59f9ccd73b672065f143b8ac57343d742cdc2d792d03\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6477a59_2dc3_4fff_907e_7e927cf257d3.slice/crio-34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6477a59_2dc3_4fff_907e_7e927cf257d3.slice/crio-conmon-34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9.scope\": RecentStats: unable to find data in memory cache]" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.524770 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.540753 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.540930 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tct7s\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-kube-api-access-tct7s\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.541006 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-plugins\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.541057 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-tls\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.541085 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-plugins-conf\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.541144 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-confd\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.541244 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-erlang-cookie\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.541301 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-config-data\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.541342 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-server-conf\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.541475 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b6477a59-2dc3-4fff-907e-7e927cf257d3-pod-info\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.541526 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b6477a59-2dc3-4fff-907e-7e927cf257d3-erlang-cookie-secret\") pod \"b6477a59-2dc3-4fff-907e-7e927cf257d3\" (UID: \"b6477a59-2dc3-4fff-907e-7e927cf257d3\") " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 
12:42:20.542854 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.542911 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.545470 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.547653 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.548537 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.551554 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-kube-api-access-tct7s" (OuterVolumeSpecName: "kube-api-access-tct7s") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "kube-api-access-tct7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.557324 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6477a59-2dc3-4fff-907e-7e927cf257d3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.566527 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b6477a59-2dc3-4fff-907e-7e927cf257d3-pod-info" (OuterVolumeSpecName: "pod-info") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.596264 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-config-data" (OuterVolumeSpecName: "config-data") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.647128 4760 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b6477a59-2dc3-4fff-907e-7e927cf257d3-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.647964 4760 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b6477a59-2dc3-4fff-907e-7e927cf257d3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.648139 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.648266 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tct7s\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-kube-api-access-tct7s\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.648370 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.648476 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.648609 4760 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.648705 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.648822 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.673341 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-server-conf" (OuterVolumeSpecName: "server-conf") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.690031 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.755124 4760 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b6477a59-2dc3-4fff-907e-7e927cf257d3-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.755171 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.778842 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b6477a59-2dc3-4fff-907e-7e927cf257d3" (UID: "b6477a59-2dc3-4fff-907e-7e927cf257d3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:42:20 crc kubenswrapper[4760]: I1204 12:42:20.856872 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b6477a59-2dc3-4fff-907e-7e927cf257d3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.081161 4760 generic.go:334] "Generic (PLEG): container finished" podID="b6477a59-2dc3-4fff-907e-7e927cf257d3" containerID="34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9" exitCode=0 Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.081356 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b6477a59-2dc3-4fff-907e-7e927cf257d3","Type":"ContainerDied","Data":"34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9"} Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.081456 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.081548 4760 scope.go:117] "RemoveContainer" containerID="34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.081522 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b6477a59-2dc3-4fff-907e-7e927cf257d3","Type":"ContainerDied","Data":"3c3057d062e84cc3725c3c925330cfab62d4853e7a1ee840efbbb3e84b1f8e11"} Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.114947 4760 scope.go:117] "RemoveContainer" containerID="59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.177722 4760 scope.go:117] "RemoveContainer" containerID="34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9" Dec 04 12:42:21 crc kubenswrapper[4760]: E1204 12:42:21.186168 4760 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9\": container with ID starting with 34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9 not found: ID does not exist" containerID="34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.211307 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9"} err="failed to get container status \"34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9\": rpc error: code = NotFound desc = could not find container \"34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9\": container with ID starting with 34f8295b432614e3d20aeebbe1e88c0439159286c946fdeadbe80ab6c16ceab9 not found: ID does not exist" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.211372 4760 scope.go:117] "RemoveContainer" containerID="59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.204018 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 12:42:21 crc kubenswrapper[4760]: E1204 12:42:21.229022 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc\": container with ID starting with 59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc not found: ID does not exist" containerID="59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.229092 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc"} err="failed to 
get container status \"59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc\": rpc error: code = NotFound desc = could not find container \"59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc\": container with ID starting with 59cc72f4084936a5e58e3a5868d19acae5c45f22dd33a87049a3a0e9c01b1bfc not found: ID does not exist" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.267832 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.279199 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 12:42:21 crc kubenswrapper[4760]: E1204 12:42:21.281761 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26d6eca-6891-4544-848a-4d34fd081522" containerName="extract-content" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.281799 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26d6eca-6891-4544-848a-4d34fd081522" containerName="extract-content" Dec 04 12:42:21 crc kubenswrapper[4760]: E1204 12:42:21.281867 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26d6eca-6891-4544-848a-4d34fd081522" containerName="extract-utilities" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.281875 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26d6eca-6891-4544-848a-4d34fd081522" containerName="extract-utilities" Dec 04 12:42:21 crc kubenswrapper[4760]: E1204 12:42:21.281893 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26d6eca-6891-4544-848a-4d34fd081522" containerName="registry-server" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.281901 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26d6eca-6891-4544-848a-4d34fd081522" containerName="registry-server" Dec 04 12:42:21 crc kubenswrapper[4760]: E1204 12:42:21.281935 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" containerName="setup-container" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.281942 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" containerName="setup-container" Dec 04 12:42:21 crc kubenswrapper[4760]: E1204 12:42:21.281953 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" containerName="rabbitmq" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.281960 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" containerName="rabbitmq" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.282451 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" containerName="rabbitmq" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.282484 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26d6eca-6891-4544-848a-4d34fd081522" containerName="registry-server" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.284410 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.296458 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.296570 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.296900 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.297275 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.297660 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-h6ktn" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.297871 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.298089 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.348613 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.380128 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50565ce8-ee16-43b2-af07-c92e7444546c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.380196 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.380400 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50565ce8-ee16-43b2-af07-c92e7444546c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.380471 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r99gj\" (UniqueName: \"kubernetes.io/projected/50565ce8-ee16-43b2-af07-c92e7444546c-kube-api-access-r99gj\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.380558 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.380588 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50565ce8-ee16-43b2-af07-c92e7444546c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.380785 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.380919 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50565ce8-ee16-43b2-af07-c92e7444546c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.380955 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50565ce8-ee16-43b2-af07-c92e7444546c-config-data\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.381014 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.381077 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.575398 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50565ce8-ee16-43b2-af07-c92e7444546c-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.575445 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.575596 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50565ce8-ee16-43b2-af07-c92e7444546c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.575624 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r99gj\" (UniqueName: \"kubernetes.io/projected/50565ce8-ee16-43b2-af07-c92e7444546c-kube-api-access-r99gj\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.575670 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.575690 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50565ce8-ee16-43b2-af07-c92e7444546c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.575846 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.575966 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50565ce8-ee16-43b2-af07-c92e7444546c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.576011 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50565ce8-ee16-43b2-af07-c92e7444546c-config-data\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.576073 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.576158 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.576463 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/50565ce8-ee16-43b2-af07-c92e7444546c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.577800 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.577924 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.578191 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.578423 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50565ce8-ee16-43b2-af07-c92e7444546c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.581686 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50565ce8-ee16-43b2-af07-c92e7444546c-config-data\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " 
pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.584607 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50565ce8-ee16-43b2-af07-c92e7444546c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.585800 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.590025 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50565ce8-ee16-43b2-af07-c92e7444546c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.603734 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50565ce8-ee16-43b2-af07-c92e7444546c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.606789 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r99gj\" (UniqueName: \"kubernetes.io/projected/50565ce8-ee16-43b2-af07-c92e7444546c-kube-api-access-r99gj\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.661936 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"50565ce8-ee16-43b2-af07-c92e7444546c\") " pod="openstack/rabbitmq-server-0" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.883891 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6477a59-2dc3-4fff-907e-7e927cf257d3" path="/var/lib/kubelet/pods/b6477a59-2dc3-4fff-907e-7e927cf257d3/volumes" Dec 04 12:42:21 crc kubenswrapper[4760]: I1204 12:42:21.951451 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.145347 4760 generic.go:334] "Generic (PLEG): container finished" podID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" containerID="0b5af118690eb894d7059f40d968e970ae6c204dc44afb15b8d6ec0656baa2ce" exitCode=0 Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.145392 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62ceba36-f8bc-4644-978c-08a4cbf88ae5","Type":"ContainerDied","Data":"0b5af118690eb894d7059f40d968e970ae6c204dc44afb15b8d6ec0656baa2ce"} Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.277289 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.419689 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-plugins-conf\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.419788 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-tls\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.419836 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62ceba36-f8bc-4644-978c-08a4cbf88ae5-erlang-cookie-secret\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.419966 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62ceba36-f8bc-4644-978c-08a4cbf88ae5-pod-info\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.420019 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-erlang-cookie\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.420185 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-confd\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.420521 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-server-conf\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.420587 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-plugins\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.420646 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg7pk\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-kube-api-access-lg7pk\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.420689 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-config-data\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.420726 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\" (UID: \"62ceba36-f8bc-4644-978c-08a4cbf88ae5\") " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 
12:42:22.421833 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.422830 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.423472 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.427550 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.433237 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ceba36-f8bc-4644-978c-08a4cbf88ae5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.433225 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/62ceba36-f8bc-4644-978c-08a4cbf88ae5-pod-info" (OuterVolumeSpecName: "pod-info") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.433488 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.437146 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-kube-api-access-lg7pk" (OuterVolumeSpecName: "kube-api-access-lg7pk") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "kube-api-access-lg7pk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.489826 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-config-data" (OuterVolumeSpecName: "config-data") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.499549 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-server-conf" (OuterVolumeSpecName: "server-conf") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.523512 4760 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.523564 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.523578 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg7pk\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-kube-api-access-lg7pk\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.523590 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-config-data\") on node \"crc\" DevicePath \"\"" 
Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.523633 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.523648 4760 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62ceba36-f8bc-4644-978c-08a4cbf88ae5-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.523661 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.525087 4760 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62ceba36-f8bc-4644-978c-08a4cbf88ae5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.525156 4760 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62ceba36-f8bc-4644-978c-08a4cbf88ae5-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.525175 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.553021 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 12:42:22 crc kubenswrapper[4760]: W1204 12:42:22.557957 4760 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50565ce8_ee16_43b2_af07_c92e7444546c.slice/crio-8b7da60a801702a943d722d95299c749646d91e423c4a8fce532f1a31ab17f23 WatchSource:0}: Error finding container 8b7da60a801702a943d722d95299c749646d91e423c4a8fce532f1a31ab17f23: Status 404 returned error can't find the container with id 8b7da60a801702a943d722d95299c749646d91e423c4a8fce532f1a31ab17f23 Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.568919 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.627501 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.629415 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "62ceba36-f8bc-4644-978c-08a4cbf88ae5" (UID: "62ceba36-f8bc-4644-978c-08a4cbf88ae5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:42:22 crc kubenswrapper[4760]: I1204 12:42:22.730536 4760 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62ceba36-f8bc-4644-978c-08a4cbf88ae5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.164714 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"50565ce8-ee16-43b2-af07-c92e7444546c","Type":"ContainerStarted","Data":"8b7da60a801702a943d722d95299c749646d91e423c4a8fce532f1a31ab17f23"} Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.167297 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62ceba36-f8bc-4644-978c-08a4cbf88ae5","Type":"ContainerDied","Data":"31de9a10947fd0eaa552656a9f4718ca456b088f24a766dc1ed1fc31c5baa428"} Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.167343 4760 scope.go:117] "RemoveContainer" containerID="0b5af118690eb894d7059f40d968e970ae6c204dc44afb15b8d6ec0656baa2ce" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.167503 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.221986 4760 scope.go:117] "RemoveContainer" containerID="c643d91bc9dc7bde5a3c757a5b02bc7fd000256f842eb4fe2df55b32bc742ec2" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.242204 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.259970 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.270468 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 12:42:23 crc kubenswrapper[4760]: E1204 12:42:23.270981 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" containerName="rabbitmq" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.270996 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" containerName="rabbitmq" Dec 04 12:42:23 crc kubenswrapper[4760]: E1204 12:42:23.271043 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" containerName="setup-container" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.271048 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" containerName="setup-container" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.271301 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" containerName="rabbitmq" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.273013 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.276538 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.276772 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.276960 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.277105 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.278164 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.278374 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dbftv" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.278542 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.308724 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678485 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678579 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678607 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678646 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/482cdb14-c28c-44e0-8054-a5e782a71b54-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678667 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkdjk\" (UniqueName: \"kubernetes.io/projected/482cdb14-c28c-44e0-8054-a5e782a71b54-kube-api-access-lkdjk\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678698 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/482cdb14-c28c-44e0-8054-a5e782a71b54-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678721 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/482cdb14-c28c-44e0-8054-a5e782a71b54-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678758 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678776 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678823 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/482cdb14-c28c-44e0-8054-a5e782a71b54-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.678878 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/482cdb14-c28c-44e0-8054-a5e782a71b54-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781031 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781108 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781144 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/482cdb14-c28c-44e0-8054-a5e782a71b54-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781168 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdjk\" (UniqueName: \"kubernetes.io/projected/482cdb14-c28c-44e0-8054-a5e782a71b54-kube-api-access-lkdjk\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781200 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/482cdb14-c28c-44e0-8054-a5e782a71b54-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781250 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/482cdb14-c28c-44e0-8054-a5e782a71b54-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781310 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781332 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781388 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/482cdb14-c28c-44e0-8054-a5e782a71b54-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781460 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/482cdb14-c28c-44e0-8054-a5e782a71b54-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.781563 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc 
kubenswrapper[4760]: I1204 12:42:23.781692 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.782119 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.782182 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.782558 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/482cdb14-c28c-44e0-8054-a5e782a71b54-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.782601 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/482cdb14-c28c-44e0-8054-a5e782a71b54-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.782744 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/482cdb14-c28c-44e0-8054-a5e782a71b54-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.787168 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.789302 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/482cdb14-c28c-44e0-8054-a5e782a71b54-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.791718 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/482cdb14-c28c-44e0-8054-a5e782a71b54-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.793602 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/482cdb14-c28c-44e0-8054-a5e782a71b54-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.809488 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdjk\" (UniqueName: \"kubernetes.io/projected/482cdb14-c28c-44e0-8054-a5e782a71b54-kube-api-access-lkdjk\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.855078 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"482cdb14-c28c-44e0-8054-a5e782a71b54\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.881266 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ceba36-f8bc-4644-978c-08a4cbf88ae5" path="/var/lib/kubelet/pods/62ceba36-f8bc-4644-978c-08a4cbf88ae5/volumes" Dec 04 12:42:23 crc kubenswrapper[4760]: I1204 12:42:23.936463 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.366478 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-ngqjv"] Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.369350 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.374507 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.378722 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-ngqjv"] Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.445776 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.504524 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-openstack-edpm-ipam\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.504657 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-sb\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.504725 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-nb\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.504863 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2tnq\" (UniqueName: 
\"kubernetes.io/projected/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-kube-api-access-b2tnq\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.504981 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-swift-storage-0\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.505151 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-config\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.505264 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-svc\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.606938 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-swift-storage-0\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.607081 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-config\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.607125 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-svc\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.607162 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-openstack-edpm-ipam\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.607245 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-sb\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.607272 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-nb\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.607317 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2tnq\" (UniqueName: 
\"kubernetes.io/projected/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-kube-api-access-b2tnq\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.608517 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-config\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.608554 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-svc\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.608920 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-sb\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.610621 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-openstack-edpm-ipam\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.617452 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.617487 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-swift-storage-0\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.647115 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2tnq\" (UniqueName: \"kubernetes.io/projected/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-kube-api-access-b2tnq\") pod \"dnsmasq-dns-5559d4f67f-ngqjv\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.696589 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:24 crc kubenswrapper[4760]: I1204 12:42:24.866865 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:42:24 crc kubenswrapper[4760]: E1204 12:42:24.867549 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:42:25 crc kubenswrapper[4760]: I1204 12:42:25.204717 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"50565ce8-ee16-43b2-af07-c92e7444546c","Type":"ContainerStarted","Data":"5a22e5f5a70eaaad5c00116fcaf4030c3c49b6dc577e51c419fb232d836d2989"} Dec 04 12:42:25 crc kubenswrapper[4760]: I1204 12:42:25.208124 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"482cdb14-c28c-44e0-8054-a5e782a71b54","Type":"ContainerStarted","Data":"10fe0333a81fd9a8e1700f0a5cd5023e5aaefad56f9f588f2c190180cd841ac8"} Dec 04 12:42:25 crc kubenswrapper[4760]: I1204 12:42:25.259168 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-ngqjv"] Dec 04 12:42:26 crc kubenswrapper[4760]: I1204 12:42:26.219796 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"482cdb14-c28c-44e0-8054-a5e782a71b54","Type":"ContainerStarted","Data":"108da259851c1ba2ffd7d1dfa1b1d661da113d8aeb71040116b96ace30277484"} Dec 04 12:42:26 crc kubenswrapper[4760]: I1204 12:42:26.221441 4760 generic.go:334] "Generic (PLEG): container finished" podID="75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" 
containerID="9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1" exitCode=0 Dec 04 12:42:26 crc kubenswrapper[4760]: I1204 12:42:26.221502 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" event={"ID":"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750","Type":"ContainerDied","Data":"9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1"} Dec 04 12:42:26 crc kubenswrapper[4760]: I1204 12:42:26.221566 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" event={"ID":"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750","Type":"ContainerStarted","Data":"332469c9bf3b59437c437653753bcc149d63f919efe46ca5df199e7d23abb8f2"} Dec 04 12:42:27 crc kubenswrapper[4760]: I1204 12:42:27.233814 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" event={"ID":"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750","Type":"ContainerStarted","Data":"9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d"} Dec 04 12:42:27 crc kubenswrapper[4760]: I1204 12:42:27.265972 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" podStartSLOduration=3.265934161 podStartE2EDuration="3.265934161s" podCreationTimestamp="2025-12-04 12:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:42:27.257130521 +0000 UTC m=+1750.298577088" watchObservedRunningTime="2025-12-04 12:42:27.265934161 +0000 UTC m=+1750.307380728" Dec 04 12:42:28 crc kubenswrapper[4760]: I1204 12:42:28.244140 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:30 crc kubenswrapper[4760]: E1204 12:42:30.577750 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice/crio-8d4ee9196de35af8f6de59f9ccd73b672065f143b8ac57343d742cdc2d792d03\": RecentStats: unable to find data in memory cache]" Dec 04 12:42:34 crc kubenswrapper[4760]: I1204 12:42:34.699434 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:34 crc kubenswrapper[4760]: I1204 12:42:34.766001 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-ttkjm"] Dec 04 12:42:34 crc kubenswrapper[4760]: I1204 12:42:34.766728 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" podUID="33a2d963-2a05-4e38-b7ee-fd3114f137c2" containerName="dnsmasq-dns" containerID="cri-o://3b534b3870c0455f46706b086c68798186dd666caa94582c32cfb3cba0feb8c0" gracePeriod=10 Dec 04 12:42:34 crc kubenswrapper[4760]: I1204 12:42:34.964785 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d99fc9df9-g4fqd"] Dec 04 12:42:34 crc kubenswrapper[4760]: I1204 12:42:34.967675 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:34 crc kubenswrapper[4760]: I1204 12:42:34.977117 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d99fc9df9-g4fqd"] Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.033910 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt9d8\" (UniqueName: \"kubernetes.io/projected/19e80d45-0318-4d8f-8567-e3aef4734081-kube-api-access-lt9d8\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.033992 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.034077 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-dns-svc\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.034116 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-dns-swift-storage-0\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.034200 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-ovsdbserver-nb\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.034323 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-config\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.034390 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-ovsdbserver-sb\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.136268 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt9d8\" (UniqueName: \"kubernetes.io/projected/19e80d45-0318-4d8f-8567-e3aef4734081-kube-api-access-lt9d8\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.136345 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.136421 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-dns-svc\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.136453 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-dns-swift-storage-0\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.136511 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-ovsdbserver-nb\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.136583 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-config\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.136625 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-ovsdbserver-sb\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.137560 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-ovsdbserver-sb\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.138808 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-dns-swift-storage-0\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.139135 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-config\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.139411 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-ovsdbserver-nb\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.142048 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.144975 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/19e80d45-0318-4d8f-8567-e3aef4734081-dns-svc\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.164505 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt9d8\" (UniqueName: \"kubernetes.io/projected/19e80d45-0318-4d8f-8567-e3aef4734081-kube-api-access-lt9d8\") pod \"dnsmasq-dns-5d99fc9df9-g4fqd\" (UID: \"19e80d45-0318-4d8f-8567-e3aef4734081\") " pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.329775 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.342705 4760 generic.go:334] "Generic (PLEG): container finished" podID="33a2d963-2a05-4e38-b7ee-fd3114f137c2" containerID="3b534b3870c0455f46706b086c68798186dd666caa94582c32cfb3cba0feb8c0" exitCode=0 Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.342769 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" event={"ID":"33a2d963-2a05-4e38-b7ee-fd3114f137c2","Type":"ContainerDied","Data":"3b534b3870c0455f46706b086c68798186dd666caa94582c32cfb3cba0feb8c0"} Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.342813 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" event={"ID":"33a2d963-2a05-4e38-b7ee-fd3114f137c2","Type":"ContainerDied","Data":"535dfb3d262d5672862849da243230c9bc29b85367ecaa55f44a90ce65b20328"} Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.342829 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="535dfb3d262d5672862849da243230c9bc29b85367ecaa55f44a90ce65b20328" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.457832 4760 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.550886 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-nb\") pod \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.551033 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-config\") pod \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.551131 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-swift-storage-0\") pod \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.551279 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-sb\") pod \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.551393 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-svc\") pod \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.551488 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztqbg\" 
(UniqueName: \"kubernetes.io/projected/33a2d963-2a05-4e38-b7ee-fd3114f137c2-kube-api-access-ztqbg\") pod \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\" (UID: \"33a2d963-2a05-4e38-b7ee-fd3114f137c2\") " Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.561784 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a2d963-2a05-4e38-b7ee-fd3114f137c2-kube-api-access-ztqbg" (OuterVolumeSpecName: "kube-api-access-ztqbg") pod "33a2d963-2a05-4e38-b7ee-fd3114f137c2" (UID: "33a2d963-2a05-4e38-b7ee-fd3114f137c2"). InnerVolumeSpecName "kube-api-access-ztqbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.650903 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-config" (OuterVolumeSpecName: "config") pod "33a2d963-2a05-4e38-b7ee-fd3114f137c2" (UID: "33a2d963-2a05-4e38-b7ee-fd3114f137c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.650923 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33a2d963-2a05-4e38-b7ee-fd3114f137c2" (UID: "33a2d963-2a05-4e38-b7ee-fd3114f137c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.663730 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33a2d963-2a05-4e38-b7ee-fd3114f137c2" (UID: "33a2d963-2a05-4e38-b7ee-fd3114f137c2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.676311 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.676349 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztqbg\" (UniqueName: \"kubernetes.io/projected/33a2d963-2a05-4e38-b7ee-fd3114f137c2-kube-api-access-ztqbg\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.676365 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.676374 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.678187 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33a2d963-2a05-4e38-b7ee-fd3114f137c2" (UID: "33a2d963-2a05-4e38-b7ee-fd3114f137c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.682263 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "33a2d963-2a05-4e38-b7ee-fd3114f137c2" (UID: "33a2d963-2a05-4e38-b7ee-fd3114f137c2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.778569 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.778642 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33a2d963-2a05-4e38-b7ee-fd3114f137c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:35 crc kubenswrapper[4760]: W1204 12:42:35.888560 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19e80d45_0318_4d8f_8567_e3aef4734081.slice/crio-9a95b343560c3eed196dd366231288fa6d6c3d73b3d038ded9e5ae9a33a5769b WatchSource:0}: Error finding container 9a95b343560c3eed196dd366231288fa6d6c3d73b3d038ded9e5ae9a33a5769b: Status 404 returned error can't find the container with id 9a95b343560c3eed196dd366231288fa6d6c3d73b3d038ded9e5ae9a33a5769b Dec 04 12:42:35 crc kubenswrapper[4760]: I1204 12:42:35.902523 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d99fc9df9-g4fqd"] Dec 04 12:42:36 crc kubenswrapper[4760]: I1204 12:42:36.355179 4760 generic.go:334] "Generic (PLEG): container finished" podID="19e80d45-0318-4d8f-8567-e3aef4734081" containerID="8f15fe48cc194cf2c514f33265d37b32f1acee3b082ed0d29770ed5ff95a804e" exitCode=0 Dec 04 12:42:36 crc kubenswrapper[4760]: I1204 12:42:36.355632 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b4c997d87-ttkjm" Dec 04 12:42:36 crc kubenswrapper[4760]: I1204 12:42:36.356824 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" event={"ID":"19e80d45-0318-4d8f-8567-e3aef4734081","Type":"ContainerDied","Data":"8f15fe48cc194cf2c514f33265d37b32f1acee3b082ed0d29770ed5ff95a804e"} Dec 04 12:42:36 crc kubenswrapper[4760]: I1204 12:42:36.356873 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" event={"ID":"19e80d45-0318-4d8f-8567-e3aef4734081","Type":"ContainerStarted","Data":"9a95b343560c3eed196dd366231288fa6d6c3d73b3d038ded9e5ae9a33a5769b"} Dec 04 12:42:36 crc kubenswrapper[4760]: I1204 12:42:36.513136 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-ttkjm"] Dec 04 12:42:36 crc kubenswrapper[4760]: I1204 12:42:36.525171 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b4c997d87-ttkjm"] Dec 04 12:42:37 crc kubenswrapper[4760]: I1204 12:42:37.369434 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" event={"ID":"19e80d45-0318-4d8f-8567-e3aef4734081","Type":"ContainerStarted","Data":"ab40a6fa3138b2b4aad881875882f88f115390d6ebbe653f165bc6581142ac9e"} Dec 04 12:42:37 crc kubenswrapper[4760]: I1204 12:42:37.369793 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:37 crc kubenswrapper[4760]: I1204 12:42:37.396525 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" podStartSLOduration=3.396472845 podStartE2EDuration="3.396472845s" podCreationTimestamp="2025-12-04 12:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:42:37.39064295 +0000 UTC 
m=+1760.432089517" watchObservedRunningTime="2025-12-04 12:42:37.396472845 +0000 UTC m=+1760.437919412" Dec 04 12:42:37 crc kubenswrapper[4760]: I1204 12:42:37.877671 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33a2d963-2a05-4e38-b7ee-fd3114f137c2" path="/var/lib/kubelet/pods/33a2d963-2a05-4e38-b7ee-fd3114f137c2/volumes" Dec 04 12:42:38 crc kubenswrapper[4760]: I1204 12:42:38.864693 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:42:38 crc kubenswrapper[4760]: E1204 12:42:38.865462 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:42:40 crc kubenswrapper[4760]: E1204 12:42:40.847950 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice/crio-8d4ee9196de35af8f6de59f9ccd73b672065f143b8ac57343d742cdc2d792d03\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice\": RecentStats: unable to find data in memory cache]" Dec 04 12:42:45 crc kubenswrapper[4760]: I1204 12:42:45.332169 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d99fc9df9-g4fqd" Dec 04 12:42:45 crc kubenswrapper[4760]: I1204 12:42:45.392887 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-ngqjv"] Dec 04 12:42:45 crc kubenswrapper[4760]: I1204 
12:42:45.393289 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" podUID="75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" containerName="dnsmasq-dns" containerID="cri-o://9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d" gracePeriod=10 Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.091971 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.266581 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-config\") pod \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.267579 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-swift-storage-0\") pod \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.267719 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-sb\") pod \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.267753 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-openstack-edpm-ipam\") pod \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.267811 4760 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2tnq\" (UniqueName: \"kubernetes.io/projected/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-kube-api-access-b2tnq\") pod \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.267841 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-svc\") pod \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.267983 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-nb\") pod \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\" (UID: \"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750\") " Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.274366 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-kube-api-access-b2tnq" (OuterVolumeSpecName: "kube-api-access-b2tnq") pod "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" (UID: "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750"). InnerVolumeSpecName "kube-api-access-b2tnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.329744 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" (UID: "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.331395 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" (UID: "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.334936 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-config" (OuterVolumeSpecName: "config") pod "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" (UID: "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.335607 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" (UID: "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.342011 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" (UID: "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.343509 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" (UID: "75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.369822 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.369856 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.369870 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2tnq\" (UniqueName: \"kubernetes.io/projected/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-kube-api-access-b2tnq\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.369880 4760 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.369889 4760 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.369898 4760 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-config\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.369908 4760 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.463772 4760 generic.go:334] "Generic (PLEG): container finished" podID="75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" containerID="9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d" exitCode=0 Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.463840 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" event={"ID":"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750","Type":"ContainerDied","Data":"9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d"} Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.463841 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.463883 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5559d4f67f-ngqjv" event={"ID":"75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750","Type":"ContainerDied","Data":"332469c9bf3b59437c437653753bcc149d63f919efe46ca5df199e7d23abb8f2"} Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.463903 4760 scope.go:117] "RemoveContainer" containerID="9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.487122 4760 scope.go:117] "RemoveContainer" containerID="9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.555001 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-ngqjv"] Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.558556 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5559d4f67f-ngqjv"] Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.579754 4760 scope.go:117] "RemoveContainer" containerID="9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d" Dec 04 12:42:46 crc kubenswrapper[4760]: E1204 12:42:46.580304 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d\": container with ID starting with 9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d not found: ID does not exist" containerID="9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.580353 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d"} err="failed to get container status 
\"9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d\": rpc error: code = NotFound desc = could not find container \"9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d\": container with ID starting with 9117ffc079bfc65695a9d09acec2b2ed36fac281762f69477e0edb041e187e3d not found: ID does not exist" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.580383 4760 scope.go:117] "RemoveContainer" containerID="9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1" Dec 04 12:42:46 crc kubenswrapper[4760]: E1204 12:42:46.580617 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1\": container with ID starting with 9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1 not found: ID does not exist" containerID="9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1" Dec 04 12:42:46 crc kubenswrapper[4760]: I1204 12:42:46.580647 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1"} err="failed to get container status \"9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1\": rpc error: code = NotFound desc = could not find container \"9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1\": container with ID starting with 9e5c3080a6392f1a4b44631d3b74a6ef30c698a6bf10ee7e05c2e37bbeee2bd1 not found: ID does not exist" Dec 04 12:42:47 crc kubenswrapper[4760]: I1204 12:42:47.939322 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" path="/var/lib/kubelet/pods/75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750/volumes" Dec 04 12:42:51 crc kubenswrapper[4760]: E1204 12:42:51.123639 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice/crio-8d4ee9196de35af8f6de59f9ccd73b672065f143b8ac57343d742cdc2d792d03\": RecentStats: unable to find data in memory cache]" Dec 04 12:42:51 crc kubenswrapper[4760]: I1204 12:42:51.865681 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:42:51 crc kubenswrapper[4760]: E1204 12:42:51.866165 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:42:57 crc kubenswrapper[4760]: I1204 12:42:57.358202 4760 scope.go:117] "RemoveContainer" containerID="7d02f4faa6170b88d08cadd8e1179e5e036635971e3b0544bf1f05777d23a70c" Dec 04 12:42:57 crc kubenswrapper[4760]: I1204 12:42:57.380886 4760 scope.go:117] "RemoveContainer" containerID="cc4bb714b5e43dd8d68e6969cad10449ea5df26f89281f0554ab6248963395e2" Dec 04 12:42:57 crc kubenswrapper[4760]: I1204 12:42:57.401990 4760 scope.go:117] "RemoveContainer" containerID="0633f872cc5171c179e3df911714690342d38a84f061229b5c5387033035ac80" Dec 04 12:42:57 crc kubenswrapper[4760]: I1204 12:42:57.430793 4760 scope.go:117] "RemoveContainer" containerID="42eacc3e69784f66561eae6cd92fe685384906e8e756b0eb71d12cbc6f7e98ae" Dec 04 12:42:57 crc kubenswrapper[4760]: I1204 12:42:57.474851 4760 scope.go:117] "RemoveContainer" containerID="397a43da3613c2b526c626e5e6731e03259fc8d31bd1daac41d3f811bb234537" Dec 04 12:42:57 crc 
kubenswrapper[4760]: I1204 12:42:57.575601 4760 generic.go:334] "Generic (PLEG): container finished" podID="50565ce8-ee16-43b2-af07-c92e7444546c" containerID="5a22e5f5a70eaaad5c00116fcaf4030c3c49b6dc577e51c419fb232d836d2989" exitCode=0 Dec 04 12:42:57 crc kubenswrapper[4760]: I1204 12:42:57.575648 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"50565ce8-ee16-43b2-af07-c92e7444546c","Type":"ContainerDied","Data":"5a22e5f5a70eaaad5c00116fcaf4030c3c49b6dc577e51c419fb232d836d2989"} Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.582260 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx"] Dec 04 12:42:58 crc kubenswrapper[4760]: E1204 12:42:58.583140 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" containerName="init" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.583159 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" containerName="init" Dec 04 12:42:58 crc kubenswrapper[4760]: E1204 12:42:58.583199 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" containerName="dnsmasq-dns" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.583257 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" containerName="dnsmasq-dns" Dec 04 12:42:58 crc kubenswrapper[4760]: E1204 12:42:58.583282 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a2d963-2a05-4e38-b7ee-fd3114f137c2" containerName="dnsmasq-dns" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.583290 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a2d963-2a05-4e38-b7ee-fd3114f137c2" containerName="dnsmasq-dns" Dec 04 12:42:58 crc kubenswrapper[4760]: E1204 12:42:58.583312 4760 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="33a2d963-2a05-4e38-b7ee-fd3114f137c2" containerName="init" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.583320 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a2d963-2a05-4e38-b7ee-fd3114f137c2" containerName="init" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.583593 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="75f6ccd3-fb55-4e86-8b10-fb1c9ccf2750" containerName="dnsmasq-dns" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.583636 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="33a2d963-2a05-4e38-b7ee-fd3114f137c2" containerName="dnsmasq-dns" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.584614 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.592268 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.592971 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.598294 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.602795 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.606763 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx"] Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.614044 4760 generic.go:334] "Generic (PLEG): container finished" podID="482cdb14-c28c-44e0-8054-a5e782a71b54" 
containerID="108da259851c1ba2ffd7d1dfa1b1d661da113d8aeb71040116b96ace30277484" exitCode=0 Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.614144 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"482cdb14-c28c-44e0-8054-a5e782a71b54","Type":"ContainerDied","Data":"108da259851c1ba2ffd7d1dfa1b1d661da113d8aeb71040116b96ace30277484"} Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.624311 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"50565ce8-ee16-43b2-af07-c92e7444546c","Type":"ContainerStarted","Data":"da113cf24848e954ed9b23ef08b50f3332b5cc6926c6620bbf66f305bbb97528"} Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.624622 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.695038 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.695452 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2mrr\" (UniqueName: \"kubernetes.io/projected/51e288c0-c373-4aa9-9c38-cb94fbeccf01-kube-api-access-r2mrr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.695754 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.695854 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.697680 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.697665865 podStartE2EDuration="37.697665865s" podCreationTimestamp="2025-12-04 12:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:42:58.678866538 +0000 UTC m=+1781.720313115" watchObservedRunningTime="2025-12-04 12:42:58.697665865 +0000 UTC m=+1781.739112432" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.798113 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.798583 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2mrr\" (UniqueName: 
\"kubernetes.io/projected/51e288c0-c373-4aa9-9c38-cb94fbeccf01-kube-api-access-r2mrr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.798656 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.798691 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.802855 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.803167 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 
12:42:58.803713 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.816628 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2mrr\" (UniqueName: \"kubernetes.io/projected/51e288c0-c373-4aa9-9c38-cb94fbeccf01-kube-api-access-r2mrr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:58 crc kubenswrapper[4760]: I1204 12:42:58.914897 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:42:59 crc kubenswrapper[4760]: I1204 12:42:59.526973 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx"] Dec 04 12:42:59 crc kubenswrapper[4760]: I1204 12:42:59.528304 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 12:42:59 crc kubenswrapper[4760]: I1204 12:42:59.637838 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"482cdb14-c28c-44e0-8054-a5e782a71b54","Type":"ContainerStarted","Data":"bde93e0ed568e0ce37a1ba832be1f31a878f3863a0517ae7c26c794d37d83c69"} Dec 04 12:42:59 crc kubenswrapper[4760]: I1204 12:42:59.638697 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:42:59 crc kubenswrapper[4760]: I1204 12:42:59.642447 4760 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" event={"ID":"51e288c0-c373-4aa9-9c38-cb94fbeccf01","Type":"ContainerStarted","Data":"6d882a4d96fc33566a6533ac651f4d246c387903e0cdb924af654a2076f29773"} Dec 04 12:42:59 crc kubenswrapper[4760]: I1204 12:42:59.682965 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.682942372 podStartE2EDuration="36.682942372s" podCreationTimestamp="2025-12-04 12:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 12:42:59.673798083 +0000 UTC m=+1782.715244650" watchObservedRunningTime="2025-12-04 12:42:59.682942372 +0000 UTC m=+1782.724388939" Dec 04 12:43:01 crc kubenswrapper[4760]: E1204 12:43:01.475690 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice/crio-8d4ee9196de35af8f6de59f9ccd73b672065f143b8ac57343d742cdc2d792d03\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice\": RecentStats: unable to find data in memory cache]" Dec 04 12:43:02 crc kubenswrapper[4760]: I1204 12:43:02.864875 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:43:02 crc kubenswrapper[4760]: E1204 12:43:02.865955 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" 
podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:43:11 crc kubenswrapper[4760]: E1204 12:43:11.770641 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice/crio-8d4ee9196de35af8f6de59f9ccd73b672065f143b8ac57343d742cdc2d792d03\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26d6eca_6891_4544_848a_4d34fd081522.slice\": RecentStats: unable to find data in memory cache]" Dec 04 12:43:11 crc kubenswrapper[4760]: I1204 12:43:11.795141 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" event={"ID":"51e288c0-c373-4aa9-9c38-cb94fbeccf01","Type":"ContainerStarted","Data":"45c41962ec59b7b52b9610d0829fdab4bd4029fbe81a48db355184da19d28b1a"} Dec 04 12:43:11 crc kubenswrapper[4760]: I1204 12:43:11.822415 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" podStartSLOduration=2.49718222 podStartE2EDuration="13.822387311s" podCreationTimestamp="2025-12-04 12:42:58 +0000 UTC" firstStartedPulling="2025-12-04 12:42:59.527976422 +0000 UTC m=+1782.569422989" lastFinishedPulling="2025-12-04 12:43:10.853181513 +0000 UTC m=+1793.894628080" observedRunningTime="2025-12-04 12:43:11.813379494 +0000 UTC m=+1794.854826061" watchObservedRunningTime="2025-12-04 12:43:11.822387311 +0000 UTC m=+1794.863833878" Dec 04 12:43:11 crc kubenswrapper[4760]: I1204 12:43:11.955486 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 12:43:13 crc kubenswrapper[4760]: I1204 12:43:13.939691 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 12:43:16 crc kubenswrapper[4760]: I1204 
12:43:16.864505 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:43:16 crc kubenswrapper[4760]: E1204 12:43:16.865081 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:43:17 crc kubenswrapper[4760]: E1204 12:43:17.906066 4760 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/d8942c590eb15b08164789479b4f4f4bdf6260dbc7f7effdc2ffa72d12d767ff/diff" to get inode usage: stat /var/lib/containers/storage/overlay/d8942c590eb15b08164789479b4f4f4bdf6260dbc7f7effdc2ffa72d12d767ff/diff: no such file or directory, extraDiskErr: Dec 04 12:43:23 crc kubenswrapper[4760]: I1204 12:43:23.959407 4760 generic.go:334] "Generic (PLEG): container finished" podID="51e288c0-c373-4aa9-9c38-cb94fbeccf01" containerID="45c41962ec59b7b52b9610d0829fdab4bd4029fbe81a48db355184da19d28b1a" exitCode=0 Dec 04 12:43:23 crc kubenswrapper[4760]: I1204 12:43:23.959509 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" event={"ID":"51e288c0-c373-4aa9-9c38-cb94fbeccf01","Type":"ContainerDied","Data":"45c41962ec59b7b52b9610d0829fdab4bd4029fbe81a48db355184da19d28b1a"} Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.443018 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.589759 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2mrr\" (UniqueName: \"kubernetes.io/projected/51e288c0-c373-4aa9-9c38-cb94fbeccf01-kube-api-access-r2mrr\") pod \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.589886 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-inventory\") pod \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.589911 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-ssh-key\") pod \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.590184 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-repo-setup-combined-ca-bundle\") pod \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.602840 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e288c0-c373-4aa9-9c38-cb94fbeccf01-kube-api-access-r2mrr" (OuterVolumeSpecName: "kube-api-access-r2mrr") pod "51e288c0-c373-4aa9-9c38-cb94fbeccf01" (UID: "51e288c0-c373-4aa9-9c38-cb94fbeccf01"). InnerVolumeSpecName "kube-api-access-r2mrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.604587 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "51e288c0-c373-4aa9-9c38-cb94fbeccf01" (UID: "51e288c0-c373-4aa9-9c38-cb94fbeccf01"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:43:25 crc kubenswrapper[4760]: E1204 12:43:25.622919 4760 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-inventory podName:51e288c0-c373-4aa9-9c38-cb94fbeccf01 nodeName:}" failed. No retries permitted until 2025-12-04 12:43:26.122874765 +0000 UTC m=+1809.164321332 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-inventory") pod "51e288c0-c373-4aa9-9c38-cb94fbeccf01" (UID: "51e288c0-c373-4aa9-9c38-cb94fbeccf01") : error deleting /var/lib/kubelet/pods/51e288c0-c373-4aa9-9c38-cb94fbeccf01/volume-subpaths: remove /var/lib/kubelet/pods/51e288c0-c373-4aa9-9c38-cb94fbeccf01/volume-subpaths: no such file or directory Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.625896 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "51e288c0-c373-4aa9-9c38-cb94fbeccf01" (UID: "51e288c0-c373-4aa9-9c38-cb94fbeccf01"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.692891 4760 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.692926 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2mrr\" (UniqueName: \"kubernetes.io/projected/51e288c0-c373-4aa9-9c38-cb94fbeccf01-kube-api-access-r2mrr\") on node \"crc\" DevicePath \"\"" Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.692935 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.984045 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" event={"ID":"51e288c0-c373-4aa9-9c38-cb94fbeccf01","Type":"ContainerDied","Data":"6d882a4d96fc33566a6533ac651f4d246c387903e0cdb924af654a2076f29773"} Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.984085 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx" Dec 04 12:43:25 crc kubenswrapper[4760]: I1204 12:43:25.984097 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d882a4d96fc33566a6533ac651f4d246c387903e0cdb924af654a2076f29773" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.069555 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw"] Dec 04 12:43:26 crc kubenswrapper[4760]: E1204 12:43:26.070120 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51e288c0-c373-4aa9-9c38-cb94fbeccf01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.070143 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e288c0-c373-4aa9-9c38-cb94fbeccf01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.070381 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="51e288c0-c373-4aa9-9c38-cb94fbeccf01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.071127 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.083540 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw"] Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.102587 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-795vc\" (UniqueName: \"kubernetes.io/projected/4ba039cb-b160-4ec8-9f00-a42e7bcce289-kube-api-access-795vc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pc5cw\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.103140 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pc5cw\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.103233 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pc5cw\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.204410 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-inventory\") pod \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\" (UID: \"51e288c0-c373-4aa9-9c38-cb94fbeccf01\") " Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 
12:43:26.204762 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pc5cw\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.204843 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pc5cw\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.204896 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-795vc\" (UniqueName: \"kubernetes.io/projected/4ba039cb-b160-4ec8-9f00-a42e7bcce289-kube-api-access-795vc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pc5cw\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.211684 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pc5cw\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.211709 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pc5cw\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.216624 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-inventory" (OuterVolumeSpecName: "inventory") pod "51e288c0-c373-4aa9-9c38-cb94fbeccf01" (UID: "51e288c0-c373-4aa9-9c38-cb94fbeccf01"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.229709 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-795vc\" (UniqueName: \"kubernetes.io/projected/4ba039cb-b160-4ec8-9f00-a42e7bcce289-kube-api-access-795vc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pc5cw\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.307702 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e288c0-c373-4aa9-9c38-cb94fbeccf01-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.390827 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.918280 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw"] Dec 04 12:43:26 crc kubenswrapper[4760]: I1204 12:43:26.995991 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" event={"ID":"4ba039cb-b160-4ec8-9f00-a42e7bcce289","Type":"ContainerStarted","Data":"a7ed3bde7dcc9a41a5ef96d3851ef5ca01776ad56d28dce53eb407a077ee222d"} Dec 04 12:43:28 crc kubenswrapper[4760]: I1204 12:43:28.008817 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" event={"ID":"4ba039cb-b160-4ec8-9f00-a42e7bcce289","Type":"ContainerStarted","Data":"3eff32ae1b439e7013c47031cc364066e92d9a27ba831c45e0073c9544cfcc7f"} Dec 04 12:43:28 crc kubenswrapper[4760]: I1204 12:43:28.027256 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" podStartSLOduration=1.439500282 podStartE2EDuration="2.027228055s" podCreationTimestamp="2025-12-04 12:43:26 +0000 UTC" firstStartedPulling="2025-12-04 12:43:26.928617478 +0000 UTC m=+1809.970064045" lastFinishedPulling="2025-12-04 12:43:27.516345251 +0000 UTC m=+1810.557791818" observedRunningTime="2025-12-04 12:43:28.025806969 +0000 UTC m=+1811.067253526" watchObservedRunningTime="2025-12-04 12:43:28.027228055 +0000 UTC m=+1811.068674622" Dec 04 12:43:30 crc kubenswrapper[4760]: I1204 12:43:30.865288 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:43:30 crc kubenswrapper[4760]: E1204 12:43:30.865960 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:43:31 crc kubenswrapper[4760]: I1204 12:43:31.039528 4760 generic.go:334] "Generic (PLEG): container finished" podID="4ba039cb-b160-4ec8-9f00-a42e7bcce289" containerID="3eff32ae1b439e7013c47031cc364066e92d9a27ba831c45e0073c9544cfcc7f" exitCode=0 Dec 04 12:43:31 crc kubenswrapper[4760]: I1204 12:43:31.039592 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" event={"ID":"4ba039cb-b160-4ec8-9f00-a42e7bcce289","Type":"ContainerDied","Data":"3eff32ae1b439e7013c47031cc364066e92d9a27ba831c45e0073c9544cfcc7f"} Dec 04 12:43:32 crc kubenswrapper[4760]: I1204 12:43:32.536448 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:32 crc kubenswrapper[4760]: I1204 12:43:32.669086 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-ssh-key\") pod \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " Dec 04 12:43:32 crc kubenswrapper[4760]: I1204 12:43:32.669139 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-795vc\" (UniqueName: \"kubernetes.io/projected/4ba039cb-b160-4ec8-9f00-a42e7bcce289-kube-api-access-795vc\") pod \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " Dec 04 12:43:32 crc kubenswrapper[4760]: I1204 12:43:32.669171 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-inventory\") pod \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\" (UID: \"4ba039cb-b160-4ec8-9f00-a42e7bcce289\") " Dec 04 12:43:32 crc kubenswrapper[4760]: I1204 12:43:32.675766 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba039cb-b160-4ec8-9f00-a42e7bcce289-kube-api-access-795vc" (OuterVolumeSpecName: "kube-api-access-795vc") pod "4ba039cb-b160-4ec8-9f00-a42e7bcce289" (UID: "4ba039cb-b160-4ec8-9f00-a42e7bcce289"). InnerVolumeSpecName "kube-api-access-795vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:43:32 crc kubenswrapper[4760]: I1204 12:43:32.698924 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-inventory" (OuterVolumeSpecName: "inventory") pod "4ba039cb-b160-4ec8-9f00-a42e7bcce289" (UID: "4ba039cb-b160-4ec8-9f00-a42e7bcce289"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:43:32 crc kubenswrapper[4760]: I1204 12:43:32.699345 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4ba039cb-b160-4ec8-9f00-a42e7bcce289" (UID: "4ba039cb-b160-4ec8-9f00-a42e7bcce289"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:43:32 crc kubenswrapper[4760]: I1204 12:43:32.772658 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:43:32 crc kubenswrapper[4760]: I1204 12:43:32.772698 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-795vc\" (UniqueName: \"kubernetes.io/projected/4ba039cb-b160-4ec8-9f00-a42e7bcce289-kube-api-access-795vc\") on node \"crc\" DevicePath \"\"" Dec 04 12:43:32 crc kubenswrapper[4760]: I1204 12:43:32.772713 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4ba039cb-b160-4ec8-9f00-a42e7bcce289-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.060856 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" event={"ID":"4ba039cb-b160-4ec8-9f00-a42e7bcce289","Type":"ContainerDied","Data":"a7ed3bde7dcc9a41a5ef96d3851ef5ca01776ad56d28dce53eb407a077ee222d"} Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.061248 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7ed3bde7dcc9a41a5ef96d3851ef5ca01776ad56d28dce53eb407a077ee222d" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.060927 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pc5cw" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.185601 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r"] Dec 04 12:43:33 crc kubenswrapper[4760]: E1204 12:43:33.186342 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba039cb-b160-4ec8-9f00-a42e7bcce289" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.186370 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba039cb-b160-4ec8-9f00-a42e7bcce289" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.186630 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba039cb-b160-4ec8-9f00-a42e7bcce289" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.187456 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.190874 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.191145 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.191232 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.191409 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.201121 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r"] Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.290699 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.291025 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.291143 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bchdw\" (UniqueName: \"kubernetes.io/projected/fd552df6-e07d-4042-b4d7-8b154163e633-kube-api-access-bchdw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.291295 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.394029 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.394187 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.394258 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bchdw\" (UniqueName: \"kubernetes.io/projected/fd552df6-e07d-4042-b4d7-8b154163e633-kube-api-access-bchdw\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.394324 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.400300 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.400312 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.400650 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.418945 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-bchdw\" (UniqueName: \"kubernetes.io/projected/fd552df6-e07d-4042-b4d7-8b154163e633-kube-api-access-bchdw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:33 crc kubenswrapper[4760]: I1204 12:43:33.510737 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:43:34 crc kubenswrapper[4760]: I1204 12:43:34.078530 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r"] Dec 04 12:43:34 crc kubenswrapper[4760]: W1204 12:43:34.083304 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd552df6_e07d_4042_b4d7_8b154163e633.slice/crio-b53d0e0940c40adaa569eaaa8b91ae5ea31aa0ff4e7f3b8a533cf4be6286e273 WatchSource:0}: Error finding container b53d0e0940c40adaa569eaaa8b91ae5ea31aa0ff4e7f3b8a533cf4be6286e273: Status 404 returned error can't find the container with id b53d0e0940c40adaa569eaaa8b91ae5ea31aa0ff4e7f3b8a533cf4be6286e273 Dec 04 12:43:35 crc kubenswrapper[4760]: I1204 12:43:35.130253 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" event={"ID":"fd552df6-e07d-4042-b4d7-8b154163e633","Type":"ContainerStarted","Data":"b53d0e0940c40adaa569eaaa8b91ae5ea31aa0ff4e7f3b8a533cf4be6286e273"} Dec 04 12:43:36 crc kubenswrapper[4760]: I1204 12:43:36.155245 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" event={"ID":"fd552df6-e07d-4042-b4d7-8b154163e633","Type":"ContainerStarted","Data":"93d12f06cd001ae5f7904e6f0f1d0a98bdbe39746a2c9e1fa08c3a3e22a27ad2"} Dec 04 12:43:36 crc kubenswrapper[4760]: I1204 12:43:36.182781 4760 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" podStartSLOduration=2.546033595 podStartE2EDuration="3.182755083s" podCreationTimestamp="2025-12-04 12:43:33 +0000 UTC" firstStartedPulling="2025-12-04 12:43:34.086171846 +0000 UTC m=+1817.127618403" lastFinishedPulling="2025-12-04 12:43:34.722893324 +0000 UTC m=+1817.764339891" observedRunningTime="2025-12-04 12:43:36.175370688 +0000 UTC m=+1819.216817285" watchObservedRunningTime="2025-12-04 12:43:36.182755083 +0000 UTC m=+1819.224201650" Dec 04 12:43:45 crc kubenswrapper[4760]: I1204 12:43:45.864609 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:43:45 crc kubenswrapper[4760]: E1204 12:43:45.865314 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:43:58 crc kubenswrapper[4760]: I1204 12:43:58.865505 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:43:58 crc kubenswrapper[4760]: E1204 12:43:58.867676 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:44:12 crc kubenswrapper[4760]: I1204 12:44:12.864378 4760 
scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:44:12 crc kubenswrapper[4760]: E1204 12:44:12.865083 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:44:26 crc kubenswrapper[4760]: I1204 12:44:26.865027 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:44:26 crc kubenswrapper[4760]: E1204 12:44:26.865982 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:44:38 crc kubenswrapper[4760]: I1204 12:44:38.864936 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:44:38 crc kubenswrapper[4760]: E1204 12:44:38.866511 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:44:52 crc kubenswrapper[4760]: I1204 
12:44:52.866284 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:44:52 crc kubenswrapper[4760]: E1204 12:44:52.866937 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:44:53 crc kubenswrapper[4760]: I1204 12:44:53.047272 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-069b-account-create-update-d2dm2"] Dec 04 12:44:53 crc kubenswrapper[4760]: I1204 12:44:53.059969 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-qnxc5"] Dec 04 12:44:53 crc kubenswrapper[4760]: I1204 12:44:53.070809 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-069b-account-create-update-d2dm2"] Dec 04 12:44:53 crc kubenswrapper[4760]: I1204 12:44:53.079133 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-qnxc5"] Dec 04 12:44:53 crc kubenswrapper[4760]: I1204 12:44:53.879956 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a470b427-0c54-4814-afcc-189f86207a0c" path="/var/lib/kubelet/pods/a470b427-0c54-4814-afcc-189f86207a0c/volumes" Dec 04 12:44:53 crc kubenswrapper[4760]: I1204 12:44:53.880926 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55a1d86-936b-439f-aef3-a2e55a596cd6" path="/var/lib/kubelet/pods/c55a1d86-936b-439f-aef3-a2e55a596cd6/volumes" Dec 04 12:44:56 crc kubenswrapper[4760]: I1204 12:44:56.031974 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-wd987"] Dec 04 12:44:56 crc kubenswrapper[4760]: I1204 
12:44:56.043001 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mnp4l"] Dec 04 12:44:56 crc kubenswrapper[4760]: I1204 12:44:56.059286 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-wd987"] Dec 04 12:44:56 crc kubenswrapper[4760]: I1204 12:44:56.069501 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6133-account-create-update-gpcjt"] Dec 04 12:44:56 crc kubenswrapper[4760]: I1204 12:44:56.078910 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mnp4l"] Dec 04 12:44:56 crc kubenswrapper[4760]: I1204 12:44:56.089769 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6133-account-create-update-gpcjt"] Dec 04 12:44:57 crc kubenswrapper[4760]: I1204 12:44:57.042827 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3e42-account-create-update-h6rgv"] Dec 04 12:44:57 crc kubenswrapper[4760]: I1204 12:44:57.055698 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3e42-account-create-update-h6rgv"] Dec 04 12:44:57 crc kubenswrapper[4760]: I1204 12:44:57.704373 4760 scope.go:117] "RemoveContainer" containerID="29b91ba4f26a2f55c7ef89bec053bdfd9259fc178a4f2eefe549440980445bdf" Dec 04 12:44:57 crc kubenswrapper[4760]: I1204 12:44:57.739345 4760 scope.go:117] "RemoveContainer" containerID="3ced422b3b35dcf011f420f9462005fbd18b90cd9806383302c006679292daee" Dec 04 12:44:57 crc kubenswrapper[4760]: I1204 12:44:57.879153 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af69cbd-7106-43d8-9a09-e55a18ffa2bb" path="/var/lib/kubelet/pods/2af69cbd-7106-43d8-9a09-e55a18ffa2bb/volumes" Dec 04 12:44:57 crc kubenswrapper[4760]: I1204 12:44:57.880455 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2e8b9f-43da-419b-8cf4-f96f3ef4c863" 
path="/var/lib/kubelet/pods/4e2e8b9f-43da-419b-8cf4-f96f3ef4c863/volumes" Dec 04 12:44:57 crc kubenswrapper[4760]: I1204 12:44:57.881605 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a484ab2a-7803-4ce1-8ba6-32942c29c4d9" path="/var/lib/kubelet/pods/a484ab2a-7803-4ce1-8ba6-32942c29c4d9/volumes" Dec 04 12:44:57 crc kubenswrapper[4760]: I1204 12:44:57.882722 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b26f6a-29a2-4c94-8606-0b8d925489aa" path="/var/lib/kubelet/pods/b5b26f6a-29a2-4c94-8606-0b8d925489aa/volumes" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.214965 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8"] Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.217579 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.223802 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.224762 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.231553 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8"] Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.394330 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c36c4e4-823f-4293-be73-174340e8074f-config-volume\") pod \"collect-profiles-29414205-2qmf8\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.394385 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c36c4e4-823f-4293-be73-174340e8074f-secret-volume\") pod \"collect-profiles-29414205-2qmf8\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.394466 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z96gs\" (UniqueName: \"kubernetes.io/projected/7c36c4e4-823f-4293-be73-174340e8074f-kube-api-access-z96gs\") pod \"collect-profiles-29414205-2qmf8\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.497136 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z96gs\" (UniqueName: \"kubernetes.io/projected/7c36c4e4-823f-4293-be73-174340e8074f-kube-api-access-z96gs\") pod \"collect-profiles-29414205-2qmf8\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.497447 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c36c4e4-823f-4293-be73-174340e8074f-config-volume\") pod \"collect-profiles-29414205-2qmf8\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.497499 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7c36c4e4-823f-4293-be73-174340e8074f-secret-volume\") pod \"collect-profiles-29414205-2qmf8\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.498720 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c36c4e4-823f-4293-be73-174340e8074f-config-volume\") pod \"collect-profiles-29414205-2qmf8\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.504705 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c36c4e4-823f-4293-be73-174340e8074f-secret-volume\") pod \"collect-profiles-29414205-2qmf8\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.517580 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z96gs\" (UniqueName: \"kubernetes.io/projected/7c36c4e4-823f-4293-be73-174340e8074f-kube-api-access-z96gs\") pod \"collect-profiles-29414205-2qmf8\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:00 crc kubenswrapper[4760]: I1204 12:45:00.544673 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:01 crc kubenswrapper[4760]: I1204 12:45:01.035990 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8"] Dec 04 12:45:01 crc kubenswrapper[4760]: I1204 12:45:01.220447 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" event={"ID":"7c36c4e4-823f-4293-be73-174340e8074f","Type":"ContainerStarted","Data":"f253a6033841aafece78fa998897a94d7222d4b7653d03bd712f20a725f31bc1"} Dec 04 12:45:02 crc kubenswrapper[4760]: I1204 12:45:02.233435 4760 generic.go:334] "Generic (PLEG): container finished" podID="7c36c4e4-823f-4293-be73-174340e8074f" containerID="347187969c055364c9a7088863d1a84eb8c0de102adcab8babd897b07c0c683e" exitCode=0 Dec 04 12:45:02 crc kubenswrapper[4760]: I1204 12:45:02.233491 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" event={"ID":"7c36c4e4-823f-4293-be73-174340e8074f","Type":"ContainerDied","Data":"347187969c055364c9a7088863d1a84eb8c0de102adcab8babd897b07c0c683e"} Dec 04 12:45:03 crc kubenswrapper[4760]: I1204 12:45:03.589471 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:03 crc kubenswrapper[4760]: I1204 12:45:03.681536 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z96gs\" (UniqueName: \"kubernetes.io/projected/7c36c4e4-823f-4293-be73-174340e8074f-kube-api-access-z96gs\") pod \"7c36c4e4-823f-4293-be73-174340e8074f\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " Dec 04 12:45:03 crc kubenswrapper[4760]: I1204 12:45:03.681855 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c36c4e4-823f-4293-be73-174340e8074f-secret-volume\") pod \"7c36c4e4-823f-4293-be73-174340e8074f\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " Dec 04 12:45:03 crc kubenswrapper[4760]: I1204 12:45:03.681895 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c36c4e4-823f-4293-be73-174340e8074f-config-volume\") pod \"7c36c4e4-823f-4293-be73-174340e8074f\" (UID: \"7c36c4e4-823f-4293-be73-174340e8074f\") " Dec 04 12:45:03 crc kubenswrapper[4760]: I1204 12:45:03.683273 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c36c4e4-823f-4293-be73-174340e8074f-config-volume" (OuterVolumeSpecName: "config-volume") pod "7c36c4e4-823f-4293-be73-174340e8074f" (UID: "7c36c4e4-823f-4293-be73-174340e8074f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:45:03 crc kubenswrapper[4760]: I1204 12:45:03.688781 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c36c4e4-823f-4293-be73-174340e8074f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7c36c4e4-823f-4293-be73-174340e8074f" (UID: "7c36c4e4-823f-4293-be73-174340e8074f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:45:03 crc kubenswrapper[4760]: I1204 12:45:03.688892 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c36c4e4-823f-4293-be73-174340e8074f-kube-api-access-z96gs" (OuterVolumeSpecName: "kube-api-access-z96gs") pod "7c36c4e4-823f-4293-be73-174340e8074f" (UID: "7c36c4e4-823f-4293-be73-174340e8074f"). InnerVolumeSpecName "kube-api-access-z96gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:45:03 crc kubenswrapper[4760]: I1204 12:45:03.785468 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z96gs\" (UniqueName: \"kubernetes.io/projected/7c36c4e4-823f-4293-be73-174340e8074f-kube-api-access-z96gs\") on node \"crc\" DevicePath \"\"" Dec 04 12:45:03 crc kubenswrapper[4760]: I1204 12:45:03.785780 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c36c4e4-823f-4293-be73-174340e8074f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 12:45:03 crc kubenswrapper[4760]: I1204 12:45:03.785791 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c36c4e4-823f-4293-be73-174340e8074f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 12:45:04 crc kubenswrapper[4760]: I1204 12:45:04.254282 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" event={"ID":"7c36c4e4-823f-4293-be73-174340e8074f","Type":"ContainerDied","Data":"f253a6033841aafece78fa998897a94d7222d4b7653d03bd712f20a725f31bc1"} Dec 04 12:45:04 crc kubenswrapper[4760]: I1204 12:45:04.254335 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f253a6033841aafece78fa998897a94d7222d4b7653d03bd712f20a725f31bc1" Dec 04 12:45:04 crc kubenswrapper[4760]: I1204 12:45:04.254346 4760 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8" Dec 04 12:45:07 crc kubenswrapper[4760]: I1204 12:45:07.872492 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:45:07 crc kubenswrapper[4760]: E1204 12:45:07.873460 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:45:19 crc kubenswrapper[4760]: I1204 12:45:19.865707 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:45:19 crc kubenswrapper[4760]: E1204 12:45:19.866864 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:45:31 crc kubenswrapper[4760]: I1204 12:45:31.865054 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:45:31 crc kubenswrapper[4760]: E1204 12:45:31.866046 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.070237 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-wc247"] Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.082569 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-dn8jg"] Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.100950 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-25bph"] Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.113619 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-xc9fh"] Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.128571 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-dn8jg"] Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.139969 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-25bph"] Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.151812 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b74a-account-create-update-tcwmk"] Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.163401 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-wc247"] Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.173893 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-xc9fh"] Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.184543 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b74a-account-create-update-tcwmk"] Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.910456 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0642bc5b-899a-4334-80ff-4eac919be523" path="/var/lib/kubelet/pods/0642bc5b-899a-4334-80ff-4eac919be523/volumes" Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.911631 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e68130-6cfc-4349-9ca6-1eaa2690e632" path="/var/lib/kubelet/pods/34e68130-6cfc-4349-9ca6-1eaa2690e632/volumes" Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.915928 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8" path="/var/lib/kubelet/pods/cbe1a0db-c4a5-46d3-9d03-d719c5a5d7b8/volumes" Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.917240 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecaeba91-3e2d-404d-97fc-d43be6e0ac06" path="/var/lib/kubelet/pods/ecaeba91-3e2d-404d-97fc-d43be6e0ac06/volumes" Dec 04 12:45:45 crc kubenswrapper[4760]: I1204 12:45:45.917898 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2bfa70a-49e1-4083-80fd-8e32e354de04" path="/var/lib/kubelet/pods/f2bfa70a-49e1-4083-80fd-8e32e354de04/volumes" Dec 04 12:45:46 crc kubenswrapper[4760]: I1204 12:45:46.045832 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-705f-account-create-update-8dpgv"] Dec 04 12:45:46 crc kubenswrapper[4760]: I1204 12:45:46.057937 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-cf87-account-create-update-8lng5"] Dec 04 12:45:46 crc kubenswrapper[4760]: I1204 12:45:46.070261 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b62a-account-create-update-nv688"] Dec 04 12:45:46 crc kubenswrapper[4760]: I1204 12:45:46.080611 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-705f-account-create-update-8dpgv"] Dec 04 12:45:46 crc kubenswrapper[4760]: I1204 12:45:46.090008 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b62a-account-create-update-nv688"] 
Dec 04 12:45:46 crc kubenswrapper[4760]: I1204 12:45:46.099693 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-cf87-account-create-update-8lng5"] Dec 04 12:45:46 crc kubenswrapper[4760]: I1204 12:45:46.864762 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:45:47 crc kubenswrapper[4760]: I1204 12:45:47.689188 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"a7ea646c9740a428c5d1ef09b7a0180e1cf33c493925d62c55b9b9b4a01acb07"} Dec 04 12:45:47 crc kubenswrapper[4760]: I1204 12:45:47.876829 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fcc928e-076c-4853-9d29-56522dc04fd8" path="/var/lib/kubelet/pods/6fcc928e-076c-4853-9d29-56522dc04fd8/volumes" Dec 04 12:45:47 crc kubenswrapper[4760]: I1204 12:45:47.877699 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="958c379a-8eea-4bb9-8e49-57a92168cf30" path="/var/lib/kubelet/pods/958c379a-8eea-4bb9-8e49-57a92168cf30/volumes" Dec 04 12:45:47 crc kubenswrapper[4760]: I1204 12:45:47.878368 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5367e8d-0a58-4d48-aada-d676ca7f78a0" path="/var/lib/kubelet/pods/e5367e8d-0a58-4d48-aada-d676ca7f78a0/volumes" Dec 04 12:45:52 crc kubenswrapper[4760]: I1204 12:45:52.043370 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vmvd8"] Dec 04 12:45:52 crc kubenswrapper[4760]: I1204 12:45:52.057516 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vmvd8"] Dec 04 12:45:53 crc kubenswrapper[4760]: I1204 12:45:53.875770 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74" 
path="/var/lib/kubelet/pods/2dfe9f73-f424-4a1c-9dbc-093fbdf2ed74/volumes" Dec 04 12:45:57 crc kubenswrapper[4760]: I1204 12:45:57.850140 4760 scope.go:117] "RemoveContainer" containerID="b5cdd98a02b38ffb70b102faa8289a81b610f49b6f2876055aca2d540f66beb4" Dec 04 12:45:57 crc kubenswrapper[4760]: I1204 12:45:57.918483 4760 scope.go:117] "RemoveContainer" containerID="46fb2cf46ada2f5103daba9a6a96a0b17b74c02c4bb8c8201176ed75f1269595" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.065933 4760 scope.go:117] "RemoveContainer" containerID="44370480737c59d5703c20d4787a738c2ec45845317fde8f91e5d2a5f0c8796a" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.106311 4760 scope.go:117] "RemoveContainer" containerID="da71358e64518aa19574b796d779780639ef71e800928d2177f0009f8213df53" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.167879 4760 scope.go:117] "RemoveContainer" containerID="8e2f1dc178baf56336675d0b42ac189f65f4b7541ca3c91397618e7f30d88e17" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.234687 4760 scope.go:117] "RemoveContainer" containerID="4ca983cdb6d8314e316ade854fe7c3ffb4778bffa38b0a188100360d08282d81" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.293628 4760 scope.go:117] "RemoveContainer" containerID="8eb75542bba557f19249684fb93dcca1b02bf49a5419db18cbf19c08cdb43f48" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.316574 4760 scope.go:117] "RemoveContainer" containerID="0d7d4acbb5200f5a67c6200cfe1f9c8f5d38a31052d0b86a271426134df2d2d7" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.338145 4760 scope.go:117] "RemoveContainer" containerID="0f91f22ee3cee8f1e541180a1829532b03212e89dddd8bb0c41db66a172d6964" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.378979 4760 scope.go:117] "RemoveContainer" containerID="b6ec6a304182e8a5c7440e1e6f632b72c538f3ec95d19031df662d96819be230" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.401566 4760 scope.go:117] "RemoveContainer" 
containerID="06922d91c6e7d0cddbbd78fa5d91ff6b27d98a19fa8acf6ae0e67d00590d1eed" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.427928 4760 scope.go:117] "RemoveContainer" containerID="55c9471a30cf978356ac54f66c947634b70256a14025c0a727307ad3c0cb08e8" Dec 04 12:45:58 crc kubenswrapper[4760]: I1204 12:45:58.462273 4760 scope.go:117] "RemoveContainer" containerID="3ec7550ef786a66bca683725bebc9dc87d1faceb96ba0c147f4d36051b39dd27" Dec 04 12:45:59 crc kubenswrapper[4760]: I1204 12:45:59.043744 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wwp5d"] Dec 04 12:45:59 crc kubenswrapper[4760]: I1204 12:45:59.055858 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wwp5d"] Dec 04 12:45:59 crc kubenswrapper[4760]: I1204 12:45:59.876728 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e75bad-e5f1-4db7-abd4-b64a956a01bd" path="/var/lib/kubelet/pods/e3e75bad-e5f1-4db7-abd4-b64a956a01bd/volumes" Dec 04 12:46:47 crc kubenswrapper[4760]: I1204 12:46:47.047960 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jl9zh"] Dec 04 12:46:47 crc kubenswrapper[4760]: I1204 12:46:47.058231 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jl9zh"] Dec 04 12:46:47 crc kubenswrapper[4760]: I1204 12:46:47.880634 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40fefe31-76d7-458b-b4ef-fb49320cbb18" path="/var/lib/kubelet/pods/40fefe31-76d7-458b-b4ef-fb49320cbb18/volumes" Dec 04 12:46:58 crc kubenswrapper[4760]: I1204 12:46:58.726965 4760 scope.go:117] "RemoveContainer" containerID="5dcf7eeec99535ce79de885e26d28e82676115f5094fa8e367a51e5c8aef363b" Dec 04 12:46:58 crc kubenswrapper[4760]: I1204 12:46:58.751023 4760 scope.go:117] "RemoveContainer" containerID="5551e9a8c6bfbc4526afcd0cfd2904800d4cb09646ae96bf9e71ee6d7eab81e7" Dec 04 12:46:58 crc kubenswrapper[4760]: I1204 12:46:58.830414 4760 
scope.go:117] "RemoveContainer" containerID="3b534b3870c0455f46706b086c68798186dd666caa94582c32cfb3cba0feb8c0" Dec 04 12:46:58 crc kubenswrapper[4760]: I1204 12:46:58.852984 4760 scope.go:117] "RemoveContainer" containerID="d252a93ff61731da07bb6354c4d3cf65978377efb0f7b3926d80b2e6af41de99" Dec 04 12:47:07 crc kubenswrapper[4760]: I1204 12:47:07.512196 4760 generic.go:334] "Generic (PLEG): container finished" podID="fd552df6-e07d-4042-b4d7-8b154163e633" containerID="93d12f06cd001ae5f7904e6f0f1d0a98bdbe39746a2c9e1fa08c3a3e22a27ad2" exitCode=0 Dec 04 12:47:07 crc kubenswrapper[4760]: I1204 12:47:07.512343 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" event={"ID":"fd552df6-e07d-4042-b4d7-8b154163e633","Type":"ContainerDied","Data":"93d12f06cd001ae5f7904e6f0f1d0a98bdbe39746a2c9e1fa08c3a3e22a27ad2"} Dec 04 12:47:08 crc kubenswrapper[4760]: I1204 12:47:08.983654 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.132119 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-ssh-key\") pod \"fd552df6-e07d-4042-b4d7-8b154163e633\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.132559 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bchdw\" (UniqueName: \"kubernetes.io/projected/fd552df6-e07d-4042-b4d7-8b154163e633-kube-api-access-bchdw\") pod \"fd552df6-e07d-4042-b4d7-8b154163e633\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.132654 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-inventory\") pod \"fd552df6-e07d-4042-b4d7-8b154163e633\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.132702 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-bootstrap-combined-ca-bundle\") pod \"fd552df6-e07d-4042-b4d7-8b154163e633\" (UID: \"fd552df6-e07d-4042-b4d7-8b154163e633\") " Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.139514 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd552df6-e07d-4042-b4d7-8b154163e633-kube-api-access-bchdw" (OuterVolumeSpecName: "kube-api-access-bchdw") pod "fd552df6-e07d-4042-b4d7-8b154163e633" (UID: "fd552df6-e07d-4042-b4d7-8b154163e633"). InnerVolumeSpecName "kube-api-access-bchdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.141492 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fd552df6-e07d-4042-b4d7-8b154163e633" (UID: "fd552df6-e07d-4042-b4d7-8b154163e633"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.168351 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-inventory" (OuterVolumeSpecName: "inventory") pod "fd552df6-e07d-4042-b4d7-8b154163e633" (UID: "fd552df6-e07d-4042-b4d7-8b154163e633"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.168889 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fd552df6-e07d-4042-b4d7-8b154163e633" (UID: "fd552df6-e07d-4042-b4d7-8b154163e633"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.235945 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bchdw\" (UniqueName: \"kubernetes.io/projected/fd552df6-e07d-4042-b4d7-8b154163e633-kube-api-access-bchdw\") on node \"crc\" DevicePath \"\"" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.235988 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.236005 4760 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.236016 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd552df6-e07d-4042-b4d7-8b154163e633-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.540744 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" event={"ID":"fd552df6-e07d-4042-b4d7-8b154163e633","Type":"ContainerDied","Data":"b53d0e0940c40adaa569eaaa8b91ae5ea31aa0ff4e7f3b8a533cf4be6286e273"} Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.540830 4760 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="b53d0e0940c40adaa569eaaa8b91ae5ea31aa0ff4e7f3b8a533cf4be6286e273" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.540841 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.636242 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj"] Dec 04 12:47:09 crc kubenswrapper[4760]: E1204 12:47:09.637027 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c36c4e4-823f-4293-be73-174340e8074f" containerName="collect-profiles" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.637050 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c36c4e4-823f-4293-be73-174340e8074f" containerName="collect-profiles" Dec 04 12:47:09 crc kubenswrapper[4760]: E1204 12:47:09.637067 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd552df6-e07d-4042-b4d7-8b154163e633" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.637074 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd552df6-e07d-4042-b4d7-8b154163e633" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.637338 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c36c4e4-823f-4293-be73-174340e8074f" containerName="collect-profiles" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.637359 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd552df6-e07d-4042-b4d7-8b154163e633" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.638158 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.640783 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.640845 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.640788 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.640980 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.648032 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj"] Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.749914 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-89lhj\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.750002 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-89lhj\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.750078 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk52q\" (UniqueName: \"kubernetes.io/projected/831952cf-f2b0-482f-bd5e-69dcf19821f9-kube-api-access-nk52q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-89lhj\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.852276 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk52q\" (UniqueName: \"kubernetes.io/projected/831952cf-f2b0-482f-bd5e-69dcf19821f9-kube-api-access-nk52q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-89lhj\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.852800 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-89lhj\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.852978 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-89lhj\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.857918 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-89lhj\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.857924 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-89lhj\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.871839 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk52q\" (UniqueName: \"kubernetes.io/projected/831952cf-f2b0-482f-bd5e-69dcf19821f9-kube-api-access-nk52q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-89lhj\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:09 crc kubenswrapper[4760]: I1204 12:47:09.962445 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:47:10 crc kubenswrapper[4760]: I1204 12:47:10.534330 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj"] Dec 04 12:47:10 crc kubenswrapper[4760]: I1204 12:47:10.550835 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" event={"ID":"831952cf-f2b0-482f-bd5e-69dcf19821f9","Type":"ContainerStarted","Data":"f2a51b57af96bf562e9bdc12570a5f5694bd9c19520e2ff07f06c235ddc0bf88"} Dec 04 12:47:11 crc kubenswrapper[4760]: I1204 12:47:11.054397 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8n9vw"] Dec 04 12:47:11 crc kubenswrapper[4760]: I1204 12:47:11.066680 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8n9vw"] Dec 04 12:47:11 crc kubenswrapper[4760]: I1204 12:47:11.562768 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" event={"ID":"831952cf-f2b0-482f-bd5e-69dcf19821f9","Type":"ContainerStarted","Data":"f6219d8dfa435c18b232ecccf3169ed95e325633e8bdfe8ee65d0fdcbf022a6d"} Dec 04 12:47:11 crc kubenswrapper[4760]: I1204 12:47:11.590611 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" podStartSLOduration=2.16007657 podStartE2EDuration="2.59057058s" podCreationTimestamp="2025-12-04 12:47:09 +0000 UTC" firstStartedPulling="2025-12-04 12:47:10.540503405 +0000 UTC m=+2033.581949972" lastFinishedPulling="2025-12-04 12:47:10.970997415 +0000 UTC m=+2034.012443982" observedRunningTime="2025-12-04 12:47:11.583279718 +0000 UTC m=+2034.624726295" watchObservedRunningTime="2025-12-04 12:47:11.59057058 +0000 UTC m=+2034.632017147" Dec 04 12:47:11 crc kubenswrapper[4760]: I1204 
12:47:11.887681 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30fe1729-73ac-43f6-bc34-b335da33a7e6" path="/var/lib/kubelet/pods/30fe1729-73ac-43f6-bc34-b335da33a7e6/volumes" Dec 04 12:47:14 crc kubenswrapper[4760]: I1204 12:47:14.054478 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-nmzx2"] Dec 04 12:47:14 crc kubenswrapper[4760]: I1204 12:47:14.067107 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-nmzx2"] Dec 04 12:47:15 crc kubenswrapper[4760]: I1204 12:47:15.878817 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe6ec201-ccbb-4003-9893-13b6656a1624" path="/var/lib/kubelet/pods/fe6ec201-ccbb-4003-9893-13b6656a1624/volumes" Dec 04 12:47:25 crc kubenswrapper[4760]: I1204 12:47:25.049524 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rqwnf"] Dec 04 12:47:25 crc kubenswrapper[4760]: I1204 12:47:25.061814 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-46wx2"] Dec 04 12:47:25 crc kubenswrapper[4760]: I1204 12:47:25.073840 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rqwnf"] Dec 04 12:47:25 crc kubenswrapper[4760]: I1204 12:47:25.084194 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-46wx2"] Dec 04 12:47:25 crc kubenswrapper[4760]: I1204 12:47:25.880900 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="640263be-b424-4ed1-b0f5-d4b9907113e2" path="/var/lib/kubelet/pods/640263be-b424-4ed1-b0f5-d4b9907113e2/volumes" Dec 04 12:47:25 crc kubenswrapper[4760]: I1204 12:47:25.881860 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8179a26-2281-4a5d-bc77-808a2f7e61bb" path="/var/lib/kubelet/pods/a8179a26-2281-4a5d-bc77-808a2f7e61bb/volumes" Dec 04 12:47:49 crc kubenswrapper[4760]: I1204 12:47:49.044845 4760 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/manila-db-sync-rb22w"] Dec 04 12:47:49 crc kubenswrapper[4760]: I1204 12:47:49.054993 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-rb22w"] Dec 04 12:47:49 crc kubenswrapper[4760]: I1204 12:47:49.878073 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2d78cb-0c7a-408f-a736-6630b41bd80b" path="/var/lib/kubelet/pods/6e2d78cb-0c7a-408f-a736-6630b41bd80b/volumes" Dec 04 12:47:58 crc kubenswrapper[4760]: I1204 12:47:58.973056 4760 scope.go:117] "RemoveContainer" containerID="5c5493bd19b539ac81e35141045a228554df10121d25d9035c912ffed092266f" Dec 04 12:47:59 crc kubenswrapper[4760]: I1204 12:47:59.021446 4760 scope.go:117] "RemoveContainer" containerID="530aa0c9feea88fddde13fdf57142533cbbd84649730112263053ace7b0beff8" Dec 04 12:47:59 crc kubenswrapper[4760]: I1204 12:47:59.070248 4760 scope.go:117] "RemoveContainer" containerID="437b9589ac1c40aef31a422b9ee2eda92a071d0f18efbaf73fcf51148230d821" Dec 04 12:47:59 crc kubenswrapper[4760]: I1204 12:47:59.147546 4760 scope.go:117] "RemoveContainer" containerID="102b444d9290b954810a48ad0f3ee29b6f9730f44a5ede18f183bb8f908f06c3" Dec 04 12:47:59 crc kubenswrapper[4760]: I1204 12:47:59.199494 4760 scope.go:117] "RemoveContainer" containerID="5ba953f8ee30f593c2b31c939e252e989f85ce1c744c6b0bda4c4dd9dfefa0c1" Dec 04 12:48:03 crc kubenswrapper[4760]: I1204 12:48:03.380799 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:48:03 crc kubenswrapper[4760]: I1204 12:48:03.382405 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:48:33 crc kubenswrapper[4760]: I1204 12:48:33.380293 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:48:33 crc kubenswrapper[4760]: I1204 12:48:33.380947 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:48:49 crc kubenswrapper[4760]: I1204 12:48:49.595523 4760 generic.go:334] "Generic (PLEG): container finished" podID="831952cf-f2b0-482f-bd5e-69dcf19821f9" containerID="f6219d8dfa435c18b232ecccf3169ed95e325633e8bdfe8ee65d0fdcbf022a6d" exitCode=0 Dec 04 12:48:49 crc kubenswrapper[4760]: I1204 12:48:49.595777 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" event={"ID":"831952cf-f2b0-482f-bd5e-69dcf19821f9","Type":"ContainerDied","Data":"f6219d8dfa435c18b232ecccf3169ed95e325633e8bdfe8ee65d0fdcbf022a6d"} Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.114360 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.191972 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-inventory\") pod \"831952cf-f2b0-482f-bd5e-69dcf19821f9\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.192388 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk52q\" (UniqueName: \"kubernetes.io/projected/831952cf-f2b0-482f-bd5e-69dcf19821f9-kube-api-access-nk52q\") pod \"831952cf-f2b0-482f-bd5e-69dcf19821f9\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.192562 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-ssh-key\") pod \"831952cf-f2b0-482f-bd5e-69dcf19821f9\" (UID: \"831952cf-f2b0-482f-bd5e-69dcf19821f9\") " Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.200360 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/831952cf-f2b0-482f-bd5e-69dcf19821f9-kube-api-access-nk52q" (OuterVolumeSpecName: "kube-api-access-nk52q") pod "831952cf-f2b0-482f-bd5e-69dcf19821f9" (UID: "831952cf-f2b0-482f-bd5e-69dcf19821f9"). InnerVolumeSpecName "kube-api-access-nk52q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.225407 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-inventory" (OuterVolumeSpecName: "inventory") pod "831952cf-f2b0-482f-bd5e-69dcf19821f9" (UID: "831952cf-f2b0-482f-bd5e-69dcf19821f9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.225703 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "831952cf-f2b0-482f-bd5e-69dcf19821f9" (UID: "831952cf-f2b0-482f-bd5e-69dcf19821f9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.296752 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk52q\" (UniqueName: \"kubernetes.io/projected/831952cf-f2b0-482f-bd5e-69dcf19821f9-kube-api-access-nk52q\") on node \"crc\" DevicePath \"\"" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.296920 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.297040 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/831952cf-f2b0-482f-bd5e-69dcf19821f9-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.646015 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" event={"ID":"831952cf-f2b0-482f-bd5e-69dcf19821f9","Type":"ContainerDied","Data":"f2a51b57af96bf562e9bdc12570a5f5694bd9c19520e2ff07f06c235ddc0bf88"} Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.646083 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2a51b57af96bf562e9bdc12570a5f5694bd9c19520e2ff07f06c235ddc0bf88" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.646185 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-89lhj" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.735429 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f"] Dec 04 12:48:51 crc kubenswrapper[4760]: E1204 12:48:51.736152 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831952cf-f2b0-482f-bd5e-69dcf19821f9" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.736184 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="831952cf-f2b0-482f-bd5e-69dcf19821f9" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.736512 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="831952cf-f2b0-482f-bd5e-69dcf19821f9" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.737623 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.740605 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.740626 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.740758 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.744020 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.748070 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f"] Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.809027 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9x22f\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.809401 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwztq\" (UniqueName: \"kubernetes.io/projected/7d193294-81b1-457c-99d0-9701df78978b-kube-api-access-xwztq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9x22f\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 
12:48:51.809638 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9x22f\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.912324 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9x22f\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.912384 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwztq\" (UniqueName: \"kubernetes.io/projected/7d193294-81b1-457c-99d0-9701df78978b-kube-api-access-xwztq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9x22f\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.912479 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9x22f\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.917416 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-9x22f\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.917689 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9x22f\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:51 crc kubenswrapper[4760]: I1204 12:48:51.930256 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwztq\" (UniqueName: \"kubernetes.io/projected/7d193294-81b1-457c-99d0-9701df78978b-kube-api-access-xwztq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9x22f\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:52 crc kubenswrapper[4760]: I1204 12:48:52.060196 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:48:52 crc kubenswrapper[4760]: I1204 12:48:52.621352 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f"] Dec 04 12:48:52 crc kubenswrapper[4760]: I1204 12:48:52.633387 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 12:48:52 crc kubenswrapper[4760]: I1204 12:48:52.661007 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" event={"ID":"7d193294-81b1-457c-99d0-9701df78978b","Type":"ContainerStarted","Data":"f3432f7e307d718ab5ff8d49596248c93ee4c1ef621369a987c5aff2007f5033"} Dec 04 12:48:53 crc kubenswrapper[4760]: I1204 12:48:53.690951 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" event={"ID":"7d193294-81b1-457c-99d0-9701df78978b","Type":"ContainerStarted","Data":"7be6ad7f6fcc0bdad5ca5308158139a14bab2d2554fa212c86bc52bd6419aa09"} Dec 04 12:48:53 crc kubenswrapper[4760]: I1204 12:48:53.717703 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" podStartSLOduration=2.25493909 podStartE2EDuration="2.717676905s" podCreationTimestamp="2025-12-04 12:48:51 +0000 UTC" firstStartedPulling="2025-12-04 12:48:52.633130084 +0000 UTC m=+2135.674576651" lastFinishedPulling="2025-12-04 12:48:53.095867899 +0000 UTC m=+2136.137314466" observedRunningTime="2025-12-04 12:48:53.71341788 +0000 UTC m=+2136.754864467" watchObservedRunningTime="2025-12-04 12:48:53.717676905 +0000 UTC m=+2136.759123472" Dec 04 12:48:56 crc kubenswrapper[4760]: I1204 12:48:56.054918 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7vhlm"] Dec 04 12:48:56 crc 
kubenswrapper[4760]: I1204 12:48:56.066727 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7vhlm"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.042770 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9784-account-create-update-vw2jz"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.052651 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9fc7-account-create-update-6ctmc"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.066422 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6c68-account-create-update-96z8z"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.078525 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5f284"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.088260 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-nszzz"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.097734 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9784-account-create-update-vw2jz"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.107509 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9fc7-account-create-update-6ctmc"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.117070 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5f284"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.127549 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6c68-account-create-update-96z8z"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.137275 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-nszzz"] Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.876229 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="36730972-a8b8-4dd7-8335-805b4b694e42" path="/var/lib/kubelet/pods/36730972-a8b8-4dd7-8335-805b4b694e42/volumes" Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.877184 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b4fdc4d-3353-4e41-8218-282fef7f1418" path="/var/lib/kubelet/pods/3b4fdc4d-3353-4e41-8218-282fef7f1418/volumes" Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.877764 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87816d08-ea62-428d-a43e-40b3e030afb5" path="/var/lib/kubelet/pods/87816d08-ea62-428d-a43e-40b3e030afb5/volumes" Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.878449 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9053d7ed-6d5e-44fe-ac2d-17ee4719a590" path="/var/lib/kubelet/pods/9053d7ed-6d5e-44fe-ac2d-17ee4719a590/volumes" Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.879558 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e0b6f32-b982-4267-a0b8-b977b91f187c" path="/var/lib/kubelet/pods/9e0b6f32-b982-4267-a0b8-b977b91f187c/volumes" Dec 04 12:48:57 crc kubenswrapper[4760]: I1204 12:48:57.880147 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b417eac6-2ecb-42b0-a9ad-23860eaefde3" path="/var/lib/kubelet/pods/b417eac6-2ecb-42b0-a9ad-23860eaefde3/volumes" Dec 04 12:48:59 crc kubenswrapper[4760]: I1204 12:48:59.366939 4760 scope.go:117] "RemoveContainer" containerID="94a423a2ffd13e04dbd4b909957b6c76b3ad60e18a51e38a6d7bfcbb18e243a9" Dec 04 12:48:59 crc kubenswrapper[4760]: I1204 12:48:59.394936 4760 scope.go:117] "RemoveContainer" containerID="b62f6d2b0afedecc875b3ba8b0f7a8c578b971238d9e7cbeb5f09a19e8f27452" Dec 04 12:48:59 crc kubenswrapper[4760]: I1204 12:48:59.454913 4760 scope.go:117] "RemoveContainer" containerID="10ecf2515015ca9ef4e4759055a22fa3131ff6f83c7b98b7545f64f888176d88" Dec 04 12:48:59 crc kubenswrapper[4760]: I1204 12:48:59.521085 4760 scope.go:117] 
"RemoveContainer" containerID="eaf187efe64777a275756f0b4323aef50b158ed5d1768e9779421cbbe1d68feb" Dec 04 12:48:59 crc kubenswrapper[4760]: I1204 12:48:59.569811 4760 scope.go:117] "RemoveContainer" containerID="63486cd05984cc68b3dc2624522f8411eb759180ab672a45d3d30cd335427950" Dec 04 12:48:59 crc kubenswrapper[4760]: I1204 12:48:59.640928 4760 scope.go:117] "RemoveContainer" containerID="51d4a3aaa35392c3a3ba9d4e0a4396713a8d4da9ffefd319c448ad7cfea5c85f" Dec 04 12:49:03 crc kubenswrapper[4760]: I1204 12:49:03.380772 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:49:03 crc kubenswrapper[4760]: I1204 12:49:03.381479 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:49:03 crc kubenswrapper[4760]: I1204 12:49:03.381548 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:49:03 crc kubenswrapper[4760]: I1204 12:49:03.382766 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a7ea646c9740a428c5d1ef09b7a0180e1cf33c493925d62c55b9b9b4a01acb07"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 12:49:03 crc kubenswrapper[4760]: I1204 12:49:03.382857 4760 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://a7ea646c9740a428c5d1ef09b7a0180e1cf33c493925d62c55b9b9b4a01acb07" gracePeriod=600 Dec 04 12:49:03 crc kubenswrapper[4760]: I1204 12:49:03.914071 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="a7ea646c9740a428c5d1ef09b7a0180e1cf33c493925d62c55b9b9b4a01acb07" exitCode=0 Dec 04 12:49:03 crc kubenswrapper[4760]: I1204 12:49:03.919253 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"a7ea646c9740a428c5d1ef09b7a0180e1cf33c493925d62c55b9b9b4a01acb07"} Dec 04 12:49:03 crc kubenswrapper[4760]: I1204 12:49:03.919665 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f"} Dec 04 12:49:03 crc kubenswrapper[4760]: I1204 12:49:03.919751 4760 scope.go:117] "RemoveContainer" containerID="e0bfad41a3aed0e54e77496a4baf4001c47f43a41c7fd6c93f23dbabec5f354a" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.525055 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bmplk"] Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.528023 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.545390 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmplk"] Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.640625 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-catalog-content\") pod \"certified-operators-bmplk\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.640761 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-utilities\") pod \"certified-operators-bmplk\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.640845 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dllrk\" (UniqueName: \"kubernetes.io/projected/da4540cb-c77d-4004-8d38-582d177c9b24-kube-api-access-dllrk\") pod \"certified-operators-bmplk\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.743675 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-catalog-content\") pod \"certified-operators-bmplk\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.744094 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-utilities\") pod \"certified-operators-bmplk\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.744163 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dllrk\" (UniqueName: \"kubernetes.io/projected/da4540cb-c77d-4004-8d38-582d177c9b24-kube-api-access-dllrk\") pod \"certified-operators-bmplk\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.744386 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-catalog-content\") pod \"certified-operators-bmplk\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.744640 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-utilities\") pod \"certified-operators-bmplk\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.770770 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dllrk\" (UniqueName: \"kubernetes.io/projected/da4540cb-c77d-4004-8d38-582d177c9b24-kube-api-access-dllrk\") pod \"certified-operators-bmplk\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:12 crc kubenswrapper[4760]: I1204 12:49:12.864508 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:13 crc kubenswrapper[4760]: I1204 12:49:13.443703 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmplk"] Dec 04 12:49:14 crc kubenswrapper[4760]: I1204 12:49:14.067056 4760 generic.go:334] "Generic (PLEG): container finished" podID="da4540cb-c77d-4004-8d38-582d177c9b24" containerID="25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16" exitCode=0 Dec 04 12:49:14 crc kubenswrapper[4760]: I1204 12:49:14.067174 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmplk" event={"ID":"da4540cb-c77d-4004-8d38-582d177c9b24","Type":"ContainerDied","Data":"25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16"} Dec 04 12:49:14 crc kubenswrapper[4760]: I1204 12:49:14.067402 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmplk" event={"ID":"da4540cb-c77d-4004-8d38-582d177c9b24","Type":"ContainerStarted","Data":"389c7b3e6f5dda53f6ee743e85578694954d7fabe67b2b40cfae6d0aa9d2526c"} Dec 04 12:49:15 crc kubenswrapper[4760]: I1204 12:49:15.080110 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmplk" event={"ID":"da4540cb-c77d-4004-8d38-582d177c9b24","Type":"ContainerStarted","Data":"fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59"} Dec 04 12:49:17 crc kubenswrapper[4760]: I1204 12:49:17.111482 4760 generic.go:334] "Generic (PLEG): container finished" podID="da4540cb-c77d-4004-8d38-582d177c9b24" containerID="fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59" exitCode=0 Dec 04 12:49:17 crc kubenswrapper[4760]: I1204 12:49:17.111592 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmplk" 
event={"ID":"da4540cb-c77d-4004-8d38-582d177c9b24","Type":"ContainerDied","Data":"fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59"} Dec 04 12:49:18 crc kubenswrapper[4760]: I1204 12:49:18.125104 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmplk" event={"ID":"da4540cb-c77d-4004-8d38-582d177c9b24","Type":"ContainerStarted","Data":"7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a"} Dec 04 12:49:18 crc kubenswrapper[4760]: I1204 12:49:18.153079 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bmplk" podStartSLOduration=2.720201206 podStartE2EDuration="6.153036185s" podCreationTimestamp="2025-12-04 12:49:12 +0000 UTC" firstStartedPulling="2025-12-04 12:49:14.069641534 +0000 UTC m=+2157.111088111" lastFinishedPulling="2025-12-04 12:49:17.502476523 +0000 UTC m=+2160.543923090" observedRunningTime="2025-12-04 12:49:18.1478555 +0000 UTC m=+2161.189302067" watchObservedRunningTime="2025-12-04 12:49:18.153036185 +0000 UTC m=+2161.194482762" Dec 04 12:49:22 crc kubenswrapper[4760]: I1204 12:49:22.865909 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:22 crc kubenswrapper[4760]: I1204 12:49:22.866453 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:22 crc kubenswrapper[4760]: I1204 12:49:22.917640 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:23 crc kubenswrapper[4760]: I1204 12:49:23.228856 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:23 crc kubenswrapper[4760]: I1204 12:49:23.277557 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-bmplk"] Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.197018 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bmplk" podUID="da4540cb-c77d-4004-8d38-582d177c9b24" containerName="registry-server" containerID="cri-o://7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a" gracePeriod=2 Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.641685 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.782204 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-catalog-content\") pod \"da4540cb-c77d-4004-8d38-582d177c9b24\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.782503 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dllrk\" (UniqueName: \"kubernetes.io/projected/da4540cb-c77d-4004-8d38-582d177c9b24-kube-api-access-dllrk\") pod \"da4540cb-c77d-4004-8d38-582d177c9b24\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.782601 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-utilities\") pod \"da4540cb-c77d-4004-8d38-582d177c9b24\" (UID: \"da4540cb-c77d-4004-8d38-582d177c9b24\") " Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.783619 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-utilities" (OuterVolumeSpecName: "utilities") pod "da4540cb-c77d-4004-8d38-582d177c9b24" (UID: 
"da4540cb-c77d-4004-8d38-582d177c9b24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.791344 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da4540cb-c77d-4004-8d38-582d177c9b24-kube-api-access-dllrk" (OuterVolumeSpecName: "kube-api-access-dllrk") pod "da4540cb-c77d-4004-8d38-582d177c9b24" (UID: "da4540cb-c77d-4004-8d38-582d177c9b24"). InnerVolumeSpecName "kube-api-access-dllrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.839020 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da4540cb-c77d-4004-8d38-582d177c9b24" (UID: "da4540cb-c77d-4004-8d38-582d177c9b24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.884940 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.884988 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dllrk\" (UniqueName: \"kubernetes.io/projected/da4540cb-c77d-4004-8d38-582d177c9b24-kube-api-access-dllrk\") on node \"crc\" DevicePath \"\"" Dec 04 12:49:25 crc kubenswrapper[4760]: I1204 12:49:25.885005 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da4540cb-c77d-4004-8d38-582d177c9b24-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.209139 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="da4540cb-c77d-4004-8d38-582d177c9b24" containerID="7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a" exitCode=0 Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.209193 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmplk" event={"ID":"da4540cb-c77d-4004-8d38-582d177c9b24","Type":"ContainerDied","Data":"7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a"} Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.209251 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmplk" event={"ID":"da4540cb-c77d-4004-8d38-582d177c9b24","Type":"ContainerDied","Data":"389c7b3e6f5dda53f6ee743e85578694954d7fabe67b2b40cfae6d0aa9d2526c"} Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.209259 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmplk" Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.209293 4760 scope.go:117] "RemoveContainer" containerID="7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a" Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.242745 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bmplk"] Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.245543 4760 scope.go:117] "RemoveContainer" containerID="fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59" Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.260971 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bmplk"] Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.284671 4760 scope.go:117] "RemoveContainer" containerID="25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16" Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.325589 4760 scope.go:117] "RemoveContainer" 
containerID="7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a" Dec 04 12:49:26 crc kubenswrapper[4760]: E1204 12:49:26.327053 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a\": container with ID starting with 7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a not found: ID does not exist" containerID="7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a" Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.327103 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a"} err="failed to get container status \"7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a\": rpc error: code = NotFound desc = could not find container \"7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a\": container with ID starting with 7b5801be49f64c4e64855ed9faa94cea92252fa9bd3561b47d77a0441e39723a not found: ID does not exist" Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.327137 4760 scope.go:117] "RemoveContainer" containerID="fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59" Dec 04 12:49:26 crc kubenswrapper[4760]: E1204 12:49:26.327800 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59\": container with ID starting with fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59 not found: ID does not exist" containerID="fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59" Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.327871 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59"} err="failed to get container status \"fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59\": rpc error: code = NotFound desc = could not find container \"fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59\": container with ID starting with fdb3c9fa3908b962b666f9d3e62a8d5ae23862c9e3bbfe1ef7b0316e8a823d59 not found: ID does not exist" Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.327910 4760 scope.go:117] "RemoveContainer" containerID="25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16" Dec 04 12:49:26 crc kubenswrapper[4760]: E1204 12:49:26.328428 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16\": container with ID starting with 25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16 not found: ID does not exist" containerID="25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16" Dec 04 12:49:26 crc kubenswrapper[4760]: I1204 12:49:26.328453 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16"} err="failed to get container status \"25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16\": rpc error: code = NotFound desc = could not find container \"25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16\": container with ID starting with 25ea4822c9f8b4196434e440f0077e8526ae37bfde44099707697d7aa96c1f16 not found: ID does not exist" Dec 04 12:49:27 crc kubenswrapper[4760]: I1204 12:49:27.875869 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da4540cb-c77d-4004-8d38-582d177c9b24" path="/var/lib/kubelet/pods/da4540cb-c77d-4004-8d38-582d177c9b24/volumes" Dec 04 12:49:49 crc kubenswrapper[4760]: I1204 
12:49:49.044474 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jpsv5"] Dec 04 12:49:49 crc kubenswrapper[4760]: I1204 12:49:49.058093 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-jpsv5"] Dec 04 12:49:49 crc kubenswrapper[4760]: I1204 12:49:49.878056 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ceeb719-ebc2-4aff-9b92-0e256d9c3c56" path="/var/lib/kubelet/pods/1ceeb719-ebc2-4aff-9b92-0e256d9c3c56/volumes" Dec 04 12:49:59 crc kubenswrapper[4760]: I1204 12:49:59.818582 4760 scope.go:117] "RemoveContainer" containerID="a480274240e5ff5d085247653af7f226604c60b910d2f6af1562ab79ff541799" Dec 04 12:50:06 crc kubenswrapper[4760]: I1204 12:50:06.641552 4760 generic.go:334] "Generic (PLEG): container finished" podID="7d193294-81b1-457c-99d0-9701df78978b" containerID="7be6ad7f6fcc0bdad5ca5308158139a14bab2d2554fa212c86bc52bd6419aa09" exitCode=0 Dec 04 12:50:06 crc kubenswrapper[4760]: I1204 12:50:06.641651 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" event={"ID":"7d193294-81b1-457c-99d0-9701df78978b","Type":"ContainerDied","Data":"7be6ad7f6fcc0bdad5ca5308158139a14bab2d2554fa212c86bc52bd6419aa09"} Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.188798 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.305430 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwztq\" (UniqueName: \"kubernetes.io/projected/7d193294-81b1-457c-99d0-9701df78978b-kube-api-access-xwztq\") pod \"7d193294-81b1-457c-99d0-9701df78978b\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.305618 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-ssh-key\") pod \"7d193294-81b1-457c-99d0-9701df78978b\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.305742 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-inventory\") pod \"7d193294-81b1-457c-99d0-9701df78978b\" (UID: \"7d193294-81b1-457c-99d0-9701df78978b\") " Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.311641 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d193294-81b1-457c-99d0-9701df78978b-kube-api-access-xwztq" (OuterVolumeSpecName: "kube-api-access-xwztq") pod "7d193294-81b1-457c-99d0-9701df78978b" (UID: "7d193294-81b1-457c-99d0-9701df78978b"). InnerVolumeSpecName "kube-api-access-xwztq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.337665 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7d193294-81b1-457c-99d0-9701df78978b" (UID: "7d193294-81b1-457c-99d0-9701df78978b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.340457 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-inventory" (OuterVolumeSpecName: "inventory") pod "7d193294-81b1-457c-99d0-9701df78978b" (UID: "7d193294-81b1-457c-99d0-9701df78978b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.409326 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwztq\" (UniqueName: \"kubernetes.io/projected/7d193294-81b1-457c-99d0-9701df78978b-kube-api-access-xwztq\") on node \"crc\" DevicePath \"\"" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.410118 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.410168 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d193294-81b1-457c-99d0-9701df78978b-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.665443 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" event={"ID":"7d193294-81b1-457c-99d0-9701df78978b","Type":"ContainerDied","Data":"f3432f7e307d718ab5ff8d49596248c93ee4c1ef621369a987c5aff2007f5033"} Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.665533 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3432f7e307d718ab5ff8d49596248c93ee4c1ef621369a987c5aff2007f5033" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.665498 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9x22f" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.774910 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w"] Dec 04 12:50:08 crc kubenswrapper[4760]: E1204 12:50:08.775492 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d193294-81b1-457c-99d0-9701df78978b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.775521 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d193294-81b1-457c-99d0-9701df78978b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 12:50:08 crc kubenswrapper[4760]: E1204 12:50:08.775551 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4540cb-c77d-4004-8d38-582d177c9b24" containerName="extract-content" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.775559 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4540cb-c77d-4004-8d38-582d177c9b24" containerName="extract-content" Dec 04 12:50:08 crc kubenswrapper[4760]: E1204 12:50:08.775585 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4540cb-c77d-4004-8d38-582d177c9b24" containerName="registry-server" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.775591 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4540cb-c77d-4004-8d38-582d177c9b24" containerName="registry-server" Dec 04 12:50:08 crc kubenswrapper[4760]: E1204 12:50:08.775611 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4540cb-c77d-4004-8d38-582d177c9b24" containerName="extract-utilities" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.775618 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4540cb-c77d-4004-8d38-582d177c9b24" containerName="extract-utilities" Dec 04 12:50:08 crc 
kubenswrapper[4760]: I1204 12:50:08.775883 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d193294-81b1-457c-99d0-9701df78978b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.775931 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="da4540cb-c77d-4004-8d38-582d177c9b24" containerName="registry-server" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.776860 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.779563 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.780644 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.780942 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.781127 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.795914 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w"] Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.922142 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-59m8w\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.922269 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9jhx\" (UniqueName: \"kubernetes.io/projected/85fd1b45-21c2-4541-bac9-ce63eddbc242-kube-api-access-v9jhx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-59m8w\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:08 crc kubenswrapper[4760]: I1204 12:50:08.922327 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-59m8w\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:09 crc kubenswrapper[4760]: I1204 12:50:09.024371 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-59m8w\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:09 crc kubenswrapper[4760]: I1204 12:50:09.024503 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9jhx\" (UniqueName: \"kubernetes.io/projected/85fd1b45-21c2-4541-bac9-ce63eddbc242-kube-api-access-v9jhx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-59m8w\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:09 crc kubenswrapper[4760]: I1204 12:50:09.024565 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-59m8w\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:09 crc kubenswrapper[4760]: I1204 12:50:09.028717 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-59m8w\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:09 crc kubenswrapper[4760]: I1204 12:50:09.029020 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-59m8w\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:09 crc kubenswrapper[4760]: I1204 12:50:09.046276 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9jhx\" (UniqueName: \"kubernetes.io/projected/85fd1b45-21c2-4541-bac9-ce63eddbc242-kube-api-access-v9jhx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-59m8w\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:09 crc kubenswrapper[4760]: I1204 12:50:09.108486 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:09 crc kubenswrapper[4760]: I1204 12:50:09.696313 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w"] Dec 04 12:50:10 crc kubenswrapper[4760]: I1204 12:50:10.687028 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" event={"ID":"85fd1b45-21c2-4541-bac9-ce63eddbc242","Type":"ContainerStarted","Data":"695c944b2513df1bd8ec2454ba37ea4b993d33b00a4a2b0373de0f9d616e06c6"} Dec 04 12:50:10 crc kubenswrapper[4760]: I1204 12:50:10.687079 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" event={"ID":"85fd1b45-21c2-4541-bac9-ce63eddbc242","Type":"ContainerStarted","Data":"2707b1c7ea0037dfb3698784c2fc788e52582e0ad91332b8cc3ee9be39db3c6d"} Dec 04 12:50:10 crc kubenswrapper[4760]: I1204 12:50:10.721291 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" podStartSLOduration=2.168578728 podStartE2EDuration="2.721262929s" podCreationTimestamp="2025-12-04 12:50:08 +0000 UTC" firstStartedPulling="2025-12-04 12:50:09.706479968 +0000 UTC m=+2212.747926535" lastFinishedPulling="2025-12-04 12:50:10.259164169 +0000 UTC m=+2213.300610736" observedRunningTime="2025-12-04 12:50:10.70873919 +0000 UTC m=+2213.750185767" watchObservedRunningTime="2025-12-04 12:50:10.721262929 +0000 UTC m=+2213.762709506" Dec 04 12:50:15 crc kubenswrapper[4760]: I1204 12:50:15.747829 4760 generic.go:334] "Generic (PLEG): container finished" podID="85fd1b45-21c2-4541-bac9-ce63eddbc242" containerID="695c944b2513df1bd8ec2454ba37ea4b993d33b00a4a2b0373de0f9d616e06c6" exitCode=0 Dec 04 12:50:15 crc kubenswrapper[4760]: I1204 12:50:15.747954 4760 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" event={"ID":"85fd1b45-21c2-4541-bac9-ce63eddbc242","Type":"ContainerDied","Data":"695c944b2513df1bd8ec2454ba37ea4b993d33b00a4a2b0373de0f9d616e06c6"} Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.251926 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.437903 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9jhx\" (UniqueName: \"kubernetes.io/projected/85fd1b45-21c2-4541-bac9-ce63eddbc242-kube-api-access-v9jhx\") pod \"85fd1b45-21c2-4541-bac9-ce63eddbc242\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.438131 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-inventory\") pod \"85fd1b45-21c2-4541-bac9-ce63eddbc242\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.438288 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-ssh-key\") pod \"85fd1b45-21c2-4541-bac9-ce63eddbc242\" (UID: \"85fd1b45-21c2-4541-bac9-ce63eddbc242\") " Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.447645 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85fd1b45-21c2-4541-bac9-ce63eddbc242-kube-api-access-v9jhx" (OuterVolumeSpecName: "kube-api-access-v9jhx") pod "85fd1b45-21c2-4541-bac9-ce63eddbc242" (UID: "85fd1b45-21c2-4541-bac9-ce63eddbc242"). InnerVolumeSpecName "kube-api-access-v9jhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.469543 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "85fd1b45-21c2-4541-bac9-ce63eddbc242" (UID: "85fd1b45-21c2-4541-bac9-ce63eddbc242"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.473233 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-inventory" (OuterVolumeSpecName: "inventory") pod "85fd1b45-21c2-4541-bac9-ce63eddbc242" (UID: "85fd1b45-21c2-4541-bac9-ce63eddbc242"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.542112 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.542164 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9jhx\" (UniqueName: \"kubernetes.io/projected/85fd1b45-21c2-4541-bac9-ce63eddbc242-kube-api-access-v9jhx\") on node \"crc\" DevicePath \"\"" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.542181 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85fd1b45-21c2-4541-bac9-ce63eddbc242-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.772076 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" 
event={"ID":"85fd1b45-21c2-4541-bac9-ce63eddbc242","Type":"ContainerDied","Data":"2707b1c7ea0037dfb3698784c2fc788e52582e0ad91332b8cc3ee9be39db3c6d"} Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.772157 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2707b1c7ea0037dfb3698784c2fc788e52582e0ad91332b8cc3ee9be39db3c6d" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.772377 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-59m8w" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.900257 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm"] Dec 04 12:50:17 crc kubenswrapper[4760]: E1204 12:50:17.901148 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85fd1b45-21c2-4541-bac9-ce63eddbc242" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.901175 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="85fd1b45-21c2-4541-bac9-ce63eddbc242" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.901564 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="85fd1b45-21c2-4541-bac9-ce63eddbc242" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.902566 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.906813 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.906859 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.906987 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.909763 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:50:17 crc kubenswrapper[4760]: I1204 12:50:17.910255 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm"] Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.049564 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mtwzp"] Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.055050 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b7gjm\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.055201 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b7gjm\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.055308 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwpt4\" (UniqueName: \"kubernetes.io/projected/38e0514a-720e-4407-9e18-9fff5e901aab-kube-api-access-bwpt4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b7gjm\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.063055 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mtwzp"] Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.157993 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b7gjm\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.158227 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwpt4\" (UniqueName: \"kubernetes.io/projected/38e0514a-720e-4407-9e18-9fff5e901aab-kube-api-access-bwpt4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b7gjm\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.158753 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b7gjm\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" 
Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.163359 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b7gjm\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.163736 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b7gjm\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.180348 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwpt4\" (UniqueName: \"kubernetes.io/projected/38e0514a-720e-4407-9e18-9fff5e901aab-kube-api-access-bwpt4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-b7gjm\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.233400 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:50:18 crc kubenswrapper[4760]: I1204 12:50:18.857762 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm"] Dec 04 12:50:19 crc kubenswrapper[4760]: I1204 12:50:19.794676 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" event={"ID":"38e0514a-720e-4407-9e18-9fff5e901aab","Type":"ContainerStarted","Data":"469b6b9290af18639589d979cc852d903387bdd6fb66d20ae6e8a16d7d1d9023"} Dec 04 12:50:19 crc kubenswrapper[4760]: I1204 12:50:19.795338 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" event={"ID":"38e0514a-720e-4407-9e18-9fff5e901aab","Type":"ContainerStarted","Data":"98b4dfd2ef4ddbbc6a29c414895808e212691cb322c07f841c8a85bbcfe3be3c"} Dec 04 12:50:19 crc kubenswrapper[4760]: I1204 12:50:19.820435 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" podStartSLOduration=2.447196373 podStartE2EDuration="2.820397428s" podCreationTimestamp="2025-12-04 12:50:17 +0000 UTC" firstStartedPulling="2025-12-04 12:50:18.860061989 +0000 UTC m=+2221.901508556" lastFinishedPulling="2025-12-04 12:50:19.233263044 +0000 UTC m=+2222.274709611" observedRunningTime="2025-12-04 12:50:19.812608431 +0000 UTC m=+2222.854054998" watchObservedRunningTime="2025-12-04 12:50:19.820397428 +0000 UTC m=+2222.861844015" Dec 04 12:50:19 crc kubenswrapper[4760]: I1204 12:50:19.877127 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b979c43-e46a-4104-a17a-fb065693bbbc" path="/var/lib/kubelet/pods/1b979c43-e46a-4104-a17a-fb065693bbbc/volumes" Dec 04 12:50:22 crc kubenswrapper[4760]: I1204 12:50:22.041556 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-dqc6v"] Dec 04 12:50:22 crc kubenswrapper[4760]: I1204 12:50:22.051457 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dqc6v"] Dec 04 12:50:23 crc kubenswrapper[4760]: I1204 12:50:23.894910 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b1fcf1-5d52-4713-bfa5-2856128d6df5" path="/var/lib/kubelet/pods/20b1fcf1-5d52-4713-bfa5-2856128d6df5/volumes" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.386717 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tfr9b"] Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.408467 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.430754 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tfr9b"] Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.473135 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqzv5\" (UniqueName: \"kubernetes.io/projected/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-kube-api-access-dqzv5\") pod \"redhat-operators-tfr9b\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.473439 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-utilities\") pod \"redhat-operators-tfr9b\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.473897 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-catalog-content\") pod \"redhat-operators-tfr9b\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.575939 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-catalog-content\") pod \"redhat-operators-tfr9b\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.576055 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqzv5\" (UniqueName: \"kubernetes.io/projected/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-kube-api-access-dqzv5\") pod \"redhat-operators-tfr9b\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.576153 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-utilities\") pod \"redhat-operators-tfr9b\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.576842 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-utilities\") pod \"redhat-operators-tfr9b\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.576851 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-catalog-content\") pod \"redhat-operators-tfr9b\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.603028 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqzv5\" (UniqueName: \"kubernetes.io/projected/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-kube-api-access-dqzv5\") pod \"redhat-operators-tfr9b\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:55 crc kubenswrapper[4760]: I1204 12:50:55.761477 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:50:56 crc kubenswrapper[4760]: I1204 12:50:56.377685 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tfr9b"] Dec 04 12:50:57 crc kubenswrapper[4760]: I1204 12:50:57.187560 4760 generic.go:334] "Generic (PLEG): container finished" podID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerID="3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63" exitCode=0 Dec 04 12:50:57 crc kubenswrapper[4760]: I1204 12:50:57.187621 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfr9b" event={"ID":"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4","Type":"ContainerDied","Data":"3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63"} Dec 04 12:50:57 crc kubenswrapper[4760]: I1204 12:50:57.187860 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfr9b" event={"ID":"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4","Type":"ContainerStarted","Data":"347080cd419f67d8f20186912046f23b4513f7d51fadcf7a707bba3053fa4c98"} Dec 04 12:50:58 crc kubenswrapper[4760]: I1204 12:50:58.200117 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-tfr9b" event={"ID":"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4","Type":"ContainerStarted","Data":"675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e"} Dec 04 12:50:59 crc kubenswrapper[4760]: I1204 12:50:59.916353 4760 scope.go:117] "RemoveContainer" containerID="c897eb25114cb298991bc8c864cf2dbd15edd40eea2aa954f566c207bc2d2495" Dec 04 12:50:59 crc kubenswrapper[4760]: I1204 12:50:59.970869 4760 scope.go:117] "RemoveContainer" containerID="1aadefe6022699d485dbabbf02459c00a1eb7dc7dbf3fc530898205522cdbaea" Dec 04 12:51:00 crc kubenswrapper[4760]: I1204 12:51:00.221933 4760 generic.go:334] "Generic (PLEG): container finished" podID="38e0514a-720e-4407-9e18-9fff5e901aab" containerID="469b6b9290af18639589d979cc852d903387bdd6fb66d20ae6e8a16d7d1d9023" exitCode=0 Dec 04 12:51:00 crc kubenswrapper[4760]: I1204 12:51:00.221994 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" event={"ID":"38e0514a-720e-4407-9e18-9fff5e901aab","Type":"ContainerDied","Data":"469b6b9290af18639589d979cc852d903387bdd6fb66d20ae6e8a16d7d1d9023"} Dec 04 12:51:01 crc kubenswrapper[4760]: I1204 12:51:01.702838 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:51:01 crc kubenswrapper[4760]: I1204 12:51:01.844295 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwpt4\" (UniqueName: \"kubernetes.io/projected/38e0514a-720e-4407-9e18-9fff5e901aab-kube-api-access-bwpt4\") pod \"38e0514a-720e-4407-9e18-9fff5e901aab\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " Dec 04 12:51:01 crc kubenswrapper[4760]: I1204 12:51:01.844391 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-ssh-key\") pod \"38e0514a-720e-4407-9e18-9fff5e901aab\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " Dec 04 12:51:01 crc kubenswrapper[4760]: I1204 12:51:01.844660 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-inventory\") pod \"38e0514a-720e-4407-9e18-9fff5e901aab\" (UID: \"38e0514a-720e-4407-9e18-9fff5e901aab\") " Dec 04 12:51:01 crc kubenswrapper[4760]: I1204 12:51:01.851865 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38e0514a-720e-4407-9e18-9fff5e901aab-kube-api-access-bwpt4" (OuterVolumeSpecName: "kube-api-access-bwpt4") pod "38e0514a-720e-4407-9e18-9fff5e901aab" (UID: "38e0514a-720e-4407-9e18-9fff5e901aab"). InnerVolumeSpecName "kube-api-access-bwpt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:51:01 crc kubenswrapper[4760]: I1204 12:51:01.881444 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "38e0514a-720e-4407-9e18-9fff5e901aab" (UID: "38e0514a-720e-4407-9e18-9fff5e901aab"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:51:01 crc kubenswrapper[4760]: I1204 12:51:01.881914 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-inventory" (OuterVolumeSpecName: "inventory") pod "38e0514a-720e-4407-9e18-9fff5e901aab" (UID: "38e0514a-720e-4407-9e18-9fff5e901aab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:51:01 crc kubenswrapper[4760]: I1204 12:51:01.947965 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:01 crc kubenswrapper[4760]: I1204 12:51:01.948009 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwpt4\" (UniqueName: \"kubernetes.io/projected/38e0514a-720e-4407-9e18-9fff5e901aab-kube-api-access-bwpt4\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:01 crc kubenswrapper[4760]: I1204 12:51:01.948025 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/38e0514a-720e-4407-9e18-9fff5e901aab-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.245786 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" event={"ID":"38e0514a-720e-4407-9e18-9fff5e901aab","Type":"ContainerDied","Data":"98b4dfd2ef4ddbbc6a29c414895808e212691cb322c07f841c8a85bbcfe3be3c"} Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.245847 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98b4dfd2ef4ddbbc6a29c414895808e212691cb322c07f841c8a85bbcfe3be3c" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.245957 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-b7gjm" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.255458 4760 generic.go:334] "Generic (PLEG): container finished" podID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerID="675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e" exitCode=0 Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.255522 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfr9b" event={"ID":"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4","Type":"ContainerDied","Data":"675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e"} Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.353415 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6"] Dec 04 12:51:02 crc kubenswrapper[4760]: E1204 12:51:02.354426 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38e0514a-720e-4407-9e18-9fff5e901aab" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.354449 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="38e0514a-720e-4407-9e18-9fff5e901aab" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.354680 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="38e0514a-720e-4407-9e18-9fff5e901aab" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.355677 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.360610 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.360953 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.361106 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.361362 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.367226 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6"] Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.459298 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lbqz\" (UniqueName: \"kubernetes.io/projected/c5a0aee6-7728-4bcf-8361-93bc45069c7f-kube-api-access-9lbqz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.459587 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.460136 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.562429 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.562593 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lbqz\" (UniqueName: \"kubernetes.io/projected/c5a0aee6-7728-4bcf-8361-93bc45069c7f-kube-api-access-9lbqz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.562662 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.568886 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6\" (UID: 
\"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.568954 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.583099 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lbqz\" (UniqueName: \"kubernetes.io/projected/c5a0aee6-7728-4bcf-8361-93bc45069c7f-kube-api-access-9lbqz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:02 crc kubenswrapper[4760]: I1204 12:51:02.697429 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:03 crc kubenswrapper[4760]: I1204 12:51:03.274999 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfr9b" event={"ID":"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4","Type":"ContainerStarted","Data":"b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0"} Dec 04 12:51:03 crc kubenswrapper[4760]: I1204 12:51:03.336861 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tfr9b" podStartSLOduration=2.8416467880000003 podStartE2EDuration="8.336835038s" podCreationTimestamp="2025-12-04 12:50:55 +0000 UTC" firstStartedPulling="2025-12-04 12:50:57.191002005 +0000 UTC m=+2260.232448572" lastFinishedPulling="2025-12-04 12:51:02.686190255 +0000 UTC m=+2265.727636822" observedRunningTime="2025-12-04 12:51:03.326033605 +0000 UTC m=+2266.367480172" watchObservedRunningTime="2025-12-04 12:51:03.336835038 +0000 UTC m=+2266.378281605" Dec 04 12:51:03 crc kubenswrapper[4760]: I1204 12:51:03.380775 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:51:03 crc kubenswrapper[4760]: I1204 12:51:03.380852 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:51:03 crc kubenswrapper[4760]: I1204 12:51:03.508422 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6"] Dec 04 
12:51:04 crc kubenswrapper[4760]: I1204 12:51:04.287921 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" event={"ID":"c5a0aee6-7728-4bcf-8361-93bc45069c7f","Type":"ContainerStarted","Data":"d7d24762d4e956d7075a39231a946c7d1fd11d49b5b8eb43510c6206e1dc8792"} Dec 04 12:51:04 crc kubenswrapper[4760]: I1204 12:51:04.288286 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" event={"ID":"c5a0aee6-7728-4bcf-8361-93bc45069c7f","Type":"ContainerStarted","Data":"8c8d50033def877ace5799ef7fdedeb0d83527bfeab3e9856eef0111467c49d3"} Dec 04 12:51:04 crc kubenswrapper[4760]: I1204 12:51:04.309416 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" podStartSLOduration=1.883124215 podStartE2EDuration="2.309387006s" podCreationTimestamp="2025-12-04 12:51:02 +0000 UTC" firstStartedPulling="2025-12-04 12:51:03.527791829 +0000 UTC m=+2266.569238396" lastFinishedPulling="2025-12-04 12:51:03.95405462 +0000 UTC m=+2266.995501187" observedRunningTime="2025-12-04 12:51:04.305705808 +0000 UTC m=+2267.347152375" watchObservedRunningTime="2025-12-04 12:51:04.309387006 +0000 UTC m=+2267.350833573" Dec 04 12:51:05 crc kubenswrapper[4760]: I1204 12:51:05.051386 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2mn2p"] Dec 04 12:51:05 crc kubenswrapper[4760]: I1204 12:51:05.071312 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-2mn2p"] Dec 04 12:51:05 crc kubenswrapper[4760]: I1204 12:51:05.761971 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:51:05 crc kubenswrapper[4760]: I1204 12:51:05.762391 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:51:05 crc kubenswrapper[4760]: I1204 12:51:05.878737 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9dacad-1c52-4236-9b38-e4d91fea47c8" path="/var/lib/kubelet/pods/8b9dacad-1c52-4236-9b38-e4d91fea47c8/volumes" Dec 04 12:51:06 crc kubenswrapper[4760]: I1204 12:51:06.816450 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tfr9b" podUID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerName="registry-server" probeResult="failure" output=< Dec 04 12:51:06 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Dec 04 12:51:06 crc kubenswrapper[4760]: > Dec 04 12:51:15 crc kubenswrapper[4760]: I1204 12:51:15.824447 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:51:15 crc kubenswrapper[4760]: I1204 12:51:15.898713 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:51:16 crc kubenswrapper[4760]: I1204 12:51:16.061303 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tfr9b"] Dec 04 12:51:17 crc kubenswrapper[4760]: I1204 12:51:17.418450 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tfr9b" podUID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerName="registry-server" containerID="cri-o://b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0" gracePeriod=2 Dec 04 12:51:17 crc kubenswrapper[4760]: I1204 12:51:17.943670 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.071514 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-utilities\") pod \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.071609 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqzv5\" (UniqueName: \"kubernetes.io/projected/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-kube-api-access-dqzv5\") pod \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.071702 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-catalog-content\") pod \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\" (UID: \"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4\") " Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.081665 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-utilities" (OuterVolumeSpecName: "utilities") pod "ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" (UID: "ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.087631 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-kube-api-access-dqzv5" (OuterVolumeSpecName: "kube-api-access-dqzv5") pod "ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" (UID: "ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4"). InnerVolumeSpecName "kube-api-access-dqzv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.176914 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.176951 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqzv5\" (UniqueName: \"kubernetes.io/projected/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-kube-api-access-dqzv5\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.212854 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" (UID: "ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.279442 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.429800 4760 generic.go:334] "Generic (PLEG): container finished" podID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerID="b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0" exitCode=0 Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.429860 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfr9b" event={"ID":"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4","Type":"ContainerDied","Data":"b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0"} Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.429901 4760 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-tfr9b" event={"ID":"ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4","Type":"ContainerDied","Data":"347080cd419f67d8f20186912046f23b4513f7d51fadcf7a707bba3053fa4c98"} Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.429931 4760 scope.go:117] "RemoveContainer" containerID="b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.430112 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tfr9b" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.461005 4760 scope.go:117] "RemoveContainer" containerID="675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.471273 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tfr9b"] Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.487430 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tfr9b"] Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.495159 4760 scope.go:117] "RemoveContainer" containerID="3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.554188 4760 scope.go:117] "RemoveContainer" containerID="b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0" Dec 04 12:51:18 crc kubenswrapper[4760]: E1204 12:51:18.555169 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0\": container with ID starting with b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0 not found: ID does not exist" containerID="b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.555231 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0"} err="failed to get container status \"b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0\": rpc error: code = NotFound desc = could not find container \"b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0\": container with ID starting with b739396e40af97bae5ab7a0c03cedc9b4cdc3f9b8bef967ec551baf6487d8cb0 not found: ID does not exist" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.555261 4760 scope.go:117] "RemoveContainer" containerID="675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e" Dec 04 12:51:18 crc kubenswrapper[4760]: E1204 12:51:18.563480 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e\": container with ID starting with 675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e not found: ID does not exist" containerID="675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.563531 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e"} err="failed to get container status \"675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e\": rpc error: code = NotFound desc = could not find container \"675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e\": container with ID starting with 675e62c4a038b0d6b19fafa13be42a11de0f8b3998203623b289ef55497bd80e not found: ID does not exist" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.563568 4760 scope.go:117] "RemoveContainer" containerID="3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63" Dec 04 12:51:18 crc kubenswrapper[4760]: E1204 
12:51:18.564341 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63\": container with ID starting with 3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63 not found: ID does not exist" containerID="3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63" Dec 04 12:51:18 crc kubenswrapper[4760]: I1204 12:51:18.564389 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63"} err="failed to get container status \"3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63\": rpc error: code = NotFound desc = could not find container \"3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63\": container with ID starting with 3dd913cdf7889677fc3b1bc534be2a0f622743ed617f3bc36d83a98c7031fb63 not found: ID does not exist" Dec 04 12:51:19 crc kubenswrapper[4760]: I1204 12:51:19.881641 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" path="/var/lib/kubelet/pods/ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4/volumes" Dec 04 12:51:33 crc kubenswrapper[4760]: I1204 12:51:33.380042 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:51:33 crc kubenswrapper[4760]: I1204 12:51:33.380530 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.344727 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8hm5v"] Dec 04 12:51:35 crc kubenswrapper[4760]: E1204 12:51:35.345704 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerName="extract-utilities" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.345725 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerName="extract-utilities" Dec 04 12:51:35 crc kubenswrapper[4760]: E1204 12:51:35.345753 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerName="registry-server" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.345762 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerName="registry-server" Dec 04 12:51:35 crc kubenswrapper[4760]: E1204 12:51:35.345821 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerName="extract-content" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.345834 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerName="extract-content" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.346119 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef3b4b4d-31c3-46b3-a4b1-10b80e08cac4" containerName="registry-server" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.348233 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.357274 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hm5v"] Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.449466 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-catalog-content\") pod \"community-operators-8hm5v\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.449538 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-utilities\") pod \"community-operators-8hm5v\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.449629 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7rsm\" (UniqueName: \"kubernetes.io/projected/b65f0732-f7b9-43d4-9b64-6789c82f3c73-kube-api-access-c7rsm\") pod \"community-operators-8hm5v\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.551655 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-catalog-content\") pod \"community-operators-8hm5v\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.551750 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-utilities\") pod \"community-operators-8hm5v\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.551812 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7rsm\" (UniqueName: \"kubernetes.io/projected/b65f0732-f7b9-43d4-9b64-6789c82f3c73-kube-api-access-c7rsm\") pod \"community-operators-8hm5v\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.552353 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-catalog-content\") pod \"community-operators-8hm5v\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.552380 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-utilities\") pod \"community-operators-8hm5v\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.573043 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7rsm\" (UniqueName: \"kubernetes.io/projected/b65f0732-f7b9-43d4-9b64-6789c82f3c73-kube-api-access-c7rsm\") pod \"community-operators-8hm5v\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:35 crc kubenswrapper[4760]: I1204 12:51:35.680534 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:36 crc kubenswrapper[4760]: I1204 12:51:36.224972 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hm5v"] Dec 04 12:51:36 crc kubenswrapper[4760]: I1204 12:51:36.608368 4760 generic.go:334] "Generic (PLEG): container finished" podID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerID="12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d" exitCode=0 Dec 04 12:51:36 crc kubenswrapper[4760]: I1204 12:51:36.608467 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hm5v" event={"ID":"b65f0732-f7b9-43d4-9b64-6789c82f3c73","Type":"ContainerDied","Data":"12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d"} Dec 04 12:51:36 crc kubenswrapper[4760]: I1204 12:51:36.608684 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hm5v" event={"ID":"b65f0732-f7b9-43d4-9b64-6789c82f3c73","Type":"ContainerStarted","Data":"e203d34bb5a9d0a80252615e9e3f010c0e444d72e8bc79032f3149ac4a0a64da"} Dec 04 12:51:37 crc kubenswrapper[4760]: I1204 12:51:37.620978 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hm5v" event={"ID":"b65f0732-f7b9-43d4-9b64-6789c82f3c73","Type":"ContainerStarted","Data":"64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f"} Dec 04 12:51:38 crc kubenswrapper[4760]: I1204 12:51:38.635975 4760 generic.go:334] "Generic (PLEG): container finished" podID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerID="64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f" exitCode=0 Dec 04 12:51:38 crc kubenswrapper[4760]: I1204 12:51:38.636033 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hm5v" 
event={"ID":"b65f0732-f7b9-43d4-9b64-6789c82f3c73","Type":"ContainerDied","Data":"64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f"} Dec 04 12:51:39 crc kubenswrapper[4760]: I1204 12:51:39.649391 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hm5v" event={"ID":"b65f0732-f7b9-43d4-9b64-6789c82f3c73","Type":"ContainerStarted","Data":"810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f"} Dec 04 12:51:39 crc kubenswrapper[4760]: I1204 12:51:39.679296 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8hm5v" podStartSLOduration=2.235014434 podStartE2EDuration="4.679249326s" podCreationTimestamp="2025-12-04 12:51:35 +0000 UTC" firstStartedPulling="2025-12-04 12:51:36.610850792 +0000 UTC m=+2299.652297359" lastFinishedPulling="2025-12-04 12:51:39.055085684 +0000 UTC m=+2302.096532251" observedRunningTime="2025-12-04 12:51:39.669532297 +0000 UTC m=+2302.710978864" watchObservedRunningTime="2025-12-04 12:51:39.679249326 +0000 UTC m=+2302.720695893" Dec 04 12:51:45 crc kubenswrapper[4760]: I1204 12:51:45.681867 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:45 crc kubenswrapper[4760]: I1204 12:51:45.682533 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:45 crc kubenswrapper[4760]: I1204 12:51:45.735429 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:45 crc kubenswrapper[4760]: I1204 12:51:45.799016 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:45 crc kubenswrapper[4760]: I1204 12:51:45.975764 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8hm5v"] Dec 04 12:51:47 crc kubenswrapper[4760]: I1204 12:51:47.719143 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8hm5v" podUID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerName="registry-server" containerID="cri-o://810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f" gracePeriod=2 Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.198048 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.298838 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-utilities\") pod \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.299115 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7rsm\" (UniqueName: \"kubernetes.io/projected/b65f0732-f7b9-43d4-9b64-6789c82f3c73-kube-api-access-c7rsm\") pod \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.299266 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-catalog-content\") pod \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\" (UID: \"b65f0732-f7b9-43d4-9b64-6789c82f3c73\") " Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.299720 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-utilities" (OuterVolumeSpecName: "utilities") pod "b65f0732-f7b9-43d4-9b64-6789c82f3c73" (UID: 
"b65f0732-f7b9-43d4-9b64-6789c82f3c73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.300056 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.304535 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65f0732-f7b9-43d4-9b64-6789c82f3c73-kube-api-access-c7rsm" (OuterVolumeSpecName: "kube-api-access-c7rsm") pod "b65f0732-f7b9-43d4-9b64-6789c82f3c73" (UID: "b65f0732-f7b9-43d4-9b64-6789c82f3c73"). InnerVolumeSpecName "kube-api-access-c7rsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.370259 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b65f0732-f7b9-43d4-9b64-6789c82f3c73" (UID: "b65f0732-f7b9-43d4-9b64-6789c82f3c73"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.402609 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7rsm\" (UniqueName: \"kubernetes.io/projected/b65f0732-f7b9-43d4-9b64-6789c82f3c73-kube-api-access-c7rsm\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.402662 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b65f0732-f7b9-43d4-9b64-6789c82f3c73-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.730975 4760 generic.go:334] "Generic (PLEG): container finished" podID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerID="810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f" exitCode=0 Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.731039 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hm5v" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.731060 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hm5v" event={"ID":"b65f0732-f7b9-43d4-9b64-6789c82f3c73","Type":"ContainerDied","Data":"810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f"} Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.731524 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hm5v" event={"ID":"b65f0732-f7b9-43d4-9b64-6789c82f3c73","Type":"ContainerDied","Data":"e203d34bb5a9d0a80252615e9e3f010c0e444d72e8bc79032f3149ac4a0a64da"} Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.731571 4760 scope.go:117] "RemoveContainer" containerID="810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.768760 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8hm5v"] Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.772908 4760 scope.go:117] "RemoveContainer" containerID="64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.783814 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8hm5v"] Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.795133 4760 scope.go:117] "RemoveContainer" containerID="12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.845155 4760 scope.go:117] "RemoveContainer" containerID="810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f" Dec 04 12:51:48 crc kubenswrapper[4760]: E1204 12:51:48.845686 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f\": container with ID starting with 810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f not found: ID does not exist" containerID="810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.845752 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f"} err="failed to get container status \"810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f\": rpc error: code = NotFound desc = could not find container \"810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f\": container with ID starting with 810863e884ed03d46d368d7a5558131c423a58fa7e91d1acc77d9dee596c946f not found: ID does not exist" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.845793 4760 scope.go:117] "RemoveContainer" 
containerID="64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f" Dec 04 12:51:48 crc kubenswrapper[4760]: E1204 12:51:48.846425 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f\": container with ID starting with 64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f not found: ID does not exist" containerID="64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.846483 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f"} err="failed to get container status \"64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f\": rpc error: code = NotFound desc = could not find container \"64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f\": container with ID starting with 64f9ea9f689580660aac04e922218844349fca2c90528c41027aa10a8732590f not found: ID does not exist" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.846519 4760 scope.go:117] "RemoveContainer" containerID="12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d" Dec 04 12:51:48 crc kubenswrapper[4760]: E1204 12:51:48.846904 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d\": container with ID starting with 12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d not found: ID does not exist" containerID="12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d" Dec 04 12:51:48 crc kubenswrapper[4760]: I1204 12:51:48.846950 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d"} err="failed to get container status \"12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d\": rpc error: code = NotFound desc = could not find container \"12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d\": container with ID starting with 12206ad1ad3c5885948a49ae048e3412baab793eef034eb58d125220f85ee46d not found: ID does not exist" Dec 04 12:51:49 crc kubenswrapper[4760]: I1204 12:51:49.874586 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" path="/var/lib/kubelet/pods/b65f0732-f7b9-43d4-9b64-6789c82f3c73/volumes" Dec 04 12:51:57 crc kubenswrapper[4760]: I1204 12:51:57.819687 4760 generic.go:334] "Generic (PLEG): container finished" podID="c5a0aee6-7728-4bcf-8361-93bc45069c7f" containerID="d7d24762d4e956d7075a39231a946c7d1fd11d49b5b8eb43510c6206e1dc8792" exitCode=0 Dec 04 12:51:57 crc kubenswrapper[4760]: I1204 12:51:57.819815 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" event={"ID":"c5a0aee6-7728-4bcf-8361-93bc45069c7f","Type":"ContainerDied","Data":"d7d24762d4e956d7075a39231a946c7d1fd11d49b5b8eb43510c6206e1dc8792"} Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.342021 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.462268 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-inventory\") pod \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.462586 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-ssh-key\") pod \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.462663 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lbqz\" (UniqueName: \"kubernetes.io/projected/c5a0aee6-7728-4bcf-8361-93bc45069c7f-kube-api-access-9lbqz\") pod \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\" (UID: \"c5a0aee6-7728-4bcf-8361-93bc45069c7f\") " Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.469318 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a0aee6-7728-4bcf-8361-93bc45069c7f-kube-api-access-9lbqz" (OuterVolumeSpecName: "kube-api-access-9lbqz") pod "c5a0aee6-7728-4bcf-8361-93bc45069c7f" (UID: "c5a0aee6-7728-4bcf-8361-93bc45069c7f"). InnerVolumeSpecName "kube-api-access-9lbqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.494677 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-inventory" (OuterVolumeSpecName: "inventory") pod "c5a0aee6-7728-4bcf-8361-93bc45069c7f" (UID: "c5a0aee6-7728-4bcf-8361-93bc45069c7f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.503636 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c5a0aee6-7728-4bcf-8361-93bc45069c7f" (UID: "c5a0aee6-7728-4bcf-8361-93bc45069c7f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.565376 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.565415 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lbqz\" (UniqueName: \"kubernetes.io/projected/c5a0aee6-7728-4bcf-8361-93bc45069c7f-kube-api-access-9lbqz\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.565430 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5a0aee6-7728-4bcf-8361-93bc45069c7f-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.850064 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" event={"ID":"c5a0aee6-7728-4bcf-8361-93bc45069c7f","Type":"ContainerDied","Data":"8c8d50033def877ace5799ef7fdedeb0d83527bfeab3e9856eef0111467c49d3"} Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.850141 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c8d50033def877ace5799ef7fdedeb0d83527bfeab3e9856eef0111467c49d3" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.850287 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.959814 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8mgzh"] Dec 04 12:51:59 crc kubenswrapper[4760]: E1204 12:51:59.960464 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerName="extract-utilities" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.960486 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerName="extract-utilities" Dec 04 12:51:59 crc kubenswrapper[4760]: E1204 12:51:59.960506 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerName="registry-server" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.960511 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerName="registry-server" Dec 04 12:51:59 crc kubenswrapper[4760]: E1204 12:51:59.960524 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a0aee6-7728-4bcf-8361-93bc45069c7f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.960532 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a0aee6-7728-4bcf-8361-93bc45069c7f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:51:59 crc kubenswrapper[4760]: E1204 12:51:59.960541 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerName="extract-content" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.960547 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerName="extract-content" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.960773 4760 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a0aee6-7728-4bcf-8361-93bc45069c7f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.960801 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65f0732-f7b9-43d4-9b64-6789c82f3c73" containerName="registry-server" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.961700 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.966669 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.970879 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.977192 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.977444 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:51:59 crc kubenswrapper[4760]: I1204 12:51:59.993751 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8mgzh"] Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.092170 4760 scope.go:117] "RemoveContainer" containerID="d2bc4a8ede2efb016db7afb420f19b4d99345b2acf77fc91ec1cbd68019c848d" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.094709 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8kww\" (UniqueName: \"kubernetes.io/projected/b297e018-4fcb-40a4-b5f5-3105c4300ae7-kube-api-access-m8kww\") pod \"ssh-known-hosts-edpm-deployment-8mgzh\" (UID: 
\"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.094866 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8mgzh\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.095265 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8mgzh\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.198500 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8mgzh\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.199961 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8kww\" (UniqueName: \"kubernetes.io/projected/b297e018-4fcb-40a4-b5f5-3105c4300ae7-kube-api-access-m8kww\") pod \"ssh-known-hosts-edpm-deployment-8mgzh\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.200177 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8mgzh\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.206438 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8mgzh\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.206860 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8mgzh\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.221360 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8kww\" (UniqueName: \"kubernetes.io/projected/b297e018-4fcb-40a4-b5f5-3105c4300ae7-kube-api-access-m8kww\") pod \"ssh-known-hosts-edpm-deployment-8mgzh\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.292998 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:00 crc kubenswrapper[4760]: I1204 12:52:00.867819 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8mgzh"] Dec 04 12:52:01 crc kubenswrapper[4760]: I1204 12:52:01.877284 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" event={"ID":"b297e018-4fcb-40a4-b5f5-3105c4300ae7","Type":"ContainerStarted","Data":"6f0c8be938962ec2d9dea93ea828aa3ad03f9af52986eb817a7ddaa4818706ef"} Dec 04 12:52:01 crc kubenswrapper[4760]: I1204 12:52:01.877678 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" event={"ID":"b297e018-4fcb-40a4-b5f5-3105c4300ae7","Type":"ContainerStarted","Data":"4246466bed4442baacf04be958822b4c762d89d07f6baa92d0162f9e88dfa5d0"} Dec 04 12:52:01 crc kubenswrapper[4760]: I1204 12:52:01.909533 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" podStartSLOduration=2.490522854 podStartE2EDuration="2.909501453s" podCreationTimestamp="2025-12-04 12:51:59 +0000 UTC" firstStartedPulling="2025-12-04 12:52:00.867020693 +0000 UTC m=+2323.908467260" lastFinishedPulling="2025-12-04 12:52:01.285999292 +0000 UTC m=+2324.327445859" observedRunningTime="2025-12-04 12:52:01.900699944 +0000 UTC m=+2324.942146511" watchObservedRunningTime="2025-12-04 12:52:01.909501453 +0000 UTC m=+2324.950948020" Dec 04 12:52:03 crc kubenswrapper[4760]: I1204 12:52:03.380956 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:52:03 crc kubenswrapper[4760]: I1204 12:52:03.382025 4760 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:52:03 crc kubenswrapper[4760]: I1204 12:52:03.382101 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 12:52:03 crc kubenswrapper[4760]: I1204 12:52:03.383565 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 12:52:03 crc kubenswrapper[4760]: I1204 12:52:03.383656 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" gracePeriod=600 Dec 04 12:52:03 crc kubenswrapper[4760]: E1204 12:52:03.520450 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:52:03 crc kubenswrapper[4760]: I1204 12:52:03.898570 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" 
containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" exitCode=0 Dec 04 12:52:03 crc kubenswrapper[4760]: I1204 12:52:03.898633 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f"} Dec 04 12:52:03 crc kubenswrapper[4760]: I1204 12:52:03.898672 4760 scope.go:117] "RemoveContainer" containerID="a7ea646c9740a428c5d1ef09b7a0180e1cf33c493925d62c55b9b9b4a01acb07" Dec 04 12:52:03 crc kubenswrapper[4760]: I1204 12:52:03.899484 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:52:03 crc kubenswrapper[4760]: E1204 12:52:03.899794 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:52:08 crc kubenswrapper[4760]: I1204 12:52:08.953903 4760 generic.go:334] "Generic (PLEG): container finished" podID="b297e018-4fcb-40a4-b5f5-3105c4300ae7" containerID="6f0c8be938962ec2d9dea93ea828aa3ad03f9af52986eb817a7ddaa4818706ef" exitCode=0 Dec 04 12:52:08 crc kubenswrapper[4760]: I1204 12:52:08.954071 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" event={"ID":"b297e018-4fcb-40a4-b5f5-3105c4300ae7","Type":"ContainerDied","Data":"6f0c8be938962ec2d9dea93ea828aa3ad03f9af52986eb817a7ddaa4818706ef"} Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.458545 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.482974 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-inventory-0\") pod \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.483164 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8kww\" (UniqueName: \"kubernetes.io/projected/b297e018-4fcb-40a4-b5f5-3105c4300ae7-kube-api-access-m8kww\") pod \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.483226 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-ssh-key-openstack-edpm-ipam\") pod \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\" (UID: \"b297e018-4fcb-40a4-b5f5-3105c4300ae7\") " Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.490441 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b297e018-4fcb-40a4-b5f5-3105c4300ae7-kube-api-access-m8kww" (OuterVolumeSpecName: "kube-api-access-m8kww") pod "b297e018-4fcb-40a4-b5f5-3105c4300ae7" (UID: "b297e018-4fcb-40a4-b5f5-3105c4300ae7"). InnerVolumeSpecName "kube-api-access-m8kww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.513728 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b297e018-4fcb-40a4-b5f5-3105c4300ae7" (UID: "b297e018-4fcb-40a4-b5f5-3105c4300ae7"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.517009 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b297e018-4fcb-40a4-b5f5-3105c4300ae7" (UID: "b297e018-4fcb-40a4-b5f5-3105c4300ae7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.589407 4760 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.589720 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8kww\" (UniqueName: \"kubernetes.io/projected/b297e018-4fcb-40a4-b5f5-3105c4300ae7-kube-api-access-m8kww\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.589742 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b297e018-4fcb-40a4-b5f5-3105c4300ae7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.977903 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" event={"ID":"b297e018-4fcb-40a4-b5f5-3105c4300ae7","Type":"ContainerDied","Data":"4246466bed4442baacf04be958822b4c762d89d07f6baa92d0162f9e88dfa5d0"} Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.977962 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8mgzh" Dec 04 12:52:10 crc kubenswrapper[4760]: I1204 12:52:10.977969 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4246466bed4442baacf04be958822b4c762d89d07f6baa92d0162f9e88dfa5d0" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.062177 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m"] Dec 04 12:52:11 crc kubenswrapper[4760]: E1204 12:52:11.062711 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b297e018-4fcb-40a4-b5f5-3105c4300ae7" containerName="ssh-known-hosts-edpm-deployment" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.062727 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b297e018-4fcb-40a4-b5f5-3105c4300ae7" containerName="ssh-known-hosts-edpm-deployment" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.062947 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b297e018-4fcb-40a4-b5f5-3105c4300ae7" containerName="ssh-known-hosts-edpm-deployment" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.063859 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.067888 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.068112 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.068198 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.068427 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.081151 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m"] Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.204794 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57gfb\" (UniqueName: \"kubernetes.io/projected/264546ed-074b-4824-8da7-d711ccc821c5-kube-api-access-57gfb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wj92m\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.205036 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wj92m\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.205120 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wj92m\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.308371 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57gfb\" (UniqueName: \"kubernetes.io/projected/264546ed-074b-4824-8da7-d711ccc821c5-kube-api-access-57gfb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wj92m\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.308443 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wj92m\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.308472 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wj92m\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.314159 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wj92m\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.315267 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wj92m\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.326738 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57gfb\" (UniqueName: \"kubernetes.io/projected/264546ed-074b-4824-8da7-d711ccc821c5-kube-api-access-57gfb\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wj92m\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.392745 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.964757 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m"] Dec 04 12:52:11 crc kubenswrapper[4760]: I1204 12:52:11.987922 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" event={"ID":"264546ed-074b-4824-8da7-d711ccc821c5","Type":"ContainerStarted","Data":"facf336c5b3d6f7ff2617707b0cbd461ab1d976b866e30cf39f477fb0f3250f1"} Dec 04 12:52:13 crc kubenswrapper[4760]: I1204 12:52:13.000372 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" event={"ID":"264546ed-074b-4824-8da7-d711ccc821c5","Type":"ContainerStarted","Data":"1a06323737ee830740d0ad7c787b13c7169cf4b8c33b753e9d47fe7220ccab5d"} Dec 04 12:52:13 crc kubenswrapper[4760]: I1204 12:52:13.024018 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" podStartSLOduration=1.5093540399999998 podStartE2EDuration="2.023993541s" podCreationTimestamp="2025-12-04 12:52:11 +0000 UTC" firstStartedPulling="2025-12-04 12:52:11.972638328 +0000 UTC m=+2335.014084905" lastFinishedPulling="2025-12-04 12:52:12.487277839 +0000 UTC m=+2335.528724406" observedRunningTime="2025-12-04 12:52:13.016476902 +0000 UTC m=+2336.057923469" watchObservedRunningTime="2025-12-04 12:52:13.023993541 +0000 UTC m=+2336.065440108" Dec 04 12:52:14 crc kubenswrapper[4760]: I1204 12:52:14.864412 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:52:14 crc kubenswrapper[4760]: E1204 12:52:14.864934 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:52:22 crc kubenswrapper[4760]: I1204 12:52:22.092023 4760 generic.go:334] "Generic (PLEG): container finished" podID="264546ed-074b-4824-8da7-d711ccc821c5" containerID="1a06323737ee830740d0ad7c787b13c7169cf4b8c33b753e9d47fe7220ccab5d" exitCode=0 Dec 04 12:52:22 crc kubenswrapper[4760]: I1204 12:52:22.092058 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" event={"ID":"264546ed-074b-4824-8da7-d711ccc821c5","Type":"ContainerDied","Data":"1a06323737ee830740d0ad7c787b13c7169cf4b8c33b753e9d47fe7220ccab5d"} Dec 04 12:52:23 crc kubenswrapper[4760]: I1204 12:52:23.572384 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:23 crc kubenswrapper[4760]: I1204 12:52:23.665388 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-ssh-key\") pod \"264546ed-074b-4824-8da7-d711ccc821c5\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " Dec 04 12:52:23 crc kubenswrapper[4760]: I1204 12:52:23.665950 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57gfb\" (UniqueName: \"kubernetes.io/projected/264546ed-074b-4824-8da7-d711ccc821c5-kube-api-access-57gfb\") pod \"264546ed-074b-4824-8da7-d711ccc821c5\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " Dec 04 12:52:23 crc kubenswrapper[4760]: I1204 12:52:23.666111 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-inventory\") pod \"264546ed-074b-4824-8da7-d711ccc821c5\" (UID: \"264546ed-074b-4824-8da7-d711ccc821c5\") " Dec 04 12:52:23 crc kubenswrapper[4760]: I1204 12:52:23.670897 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264546ed-074b-4824-8da7-d711ccc821c5-kube-api-access-57gfb" (OuterVolumeSpecName: "kube-api-access-57gfb") pod "264546ed-074b-4824-8da7-d711ccc821c5" (UID: "264546ed-074b-4824-8da7-d711ccc821c5"). InnerVolumeSpecName "kube-api-access-57gfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:52:23 crc kubenswrapper[4760]: I1204 12:52:23.698941 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-inventory" (OuterVolumeSpecName: "inventory") pod "264546ed-074b-4824-8da7-d711ccc821c5" (UID: "264546ed-074b-4824-8da7-d711ccc821c5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:52:23 crc kubenswrapper[4760]: I1204 12:52:23.699472 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "264546ed-074b-4824-8da7-d711ccc821c5" (UID: "264546ed-074b-4824-8da7-d711ccc821c5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:52:23 crc kubenswrapper[4760]: I1204 12:52:23.768578 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57gfb\" (UniqueName: \"kubernetes.io/projected/264546ed-074b-4824-8da7-d711ccc821c5-kube-api-access-57gfb\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:23 crc kubenswrapper[4760]: I1204 12:52:23.768624 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:23 crc kubenswrapper[4760]: I1204 12:52:23.768633 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/264546ed-074b-4824-8da7-d711ccc821c5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.114536 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f7h6c"] Dec 04 12:52:24 crc kubenswrapper[4760]: E1204 12:52:24.115363 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264546ed-074b-4824-8da7-d711ccc821c5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.115388 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="264546ed-074b-4824-8da7-d711ccc821c5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.115662 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="264546ed-074b-4824-8da7-d711ccc821c5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.117348 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.126421 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f7h6c"] Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.187549 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs596\" (UniqueName: \"kubernetes.io/projected/5c7f5189-be91-491c-840d-92cfb5b67a81-kube-api-access-xs596\") pod \"redhat-marketplace-f7h6c\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.187751 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-utilities\") pod \"redhat-marketplace-f7h6c\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.188034 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-catalog-content\") pod \"redhat-marketplace-f7h6c\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.194350 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" event={"ID":"264546ed-074b-4824-8da7-d711ccc821c5","Type":"ContainerDied","Data":"facf336c5b3d6f7ff2617707b0cbd461ab1d976b866e30cf39f477fb0f3250f1"} Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.194415 4760 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="facf336c5b3d6f7ff2617707b0cbd461ab1d976b866e30cf39f477fb0f3250f1" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.194497 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wj92m" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.241805 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd"] Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.246409 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.250734 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.250975 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.251623 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.251851 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.255922 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd"] Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.295947 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs596\" (UniqueName: \"kubernetes.io/projected/5c7f5189-be91-491c-840d-92cfb5b67a81-kube-api-access-xs596\") pod \"redhat-marketplace-f7h6c\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 
12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.296005 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-utilities\") pod \"redhat-marketplace-f7h6c\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.296061 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-catalog-content\") pod \"redhat-marketplace-f7h6c\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.296133 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.296169 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.296196 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g2cm\" (UniqueName: \"kubernetes.io/projected/f8e97dd8-8609-4469-a5f8-488c6b3a2098-kube-api-access-9g2cm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd\" 
(UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.296811 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-catalog-content\") pod \"redhat-marketplace-f7h6c\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.297890 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-utilities\") pod \"redhat-marketplace-f7h6c\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.318553 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs596\" (UniqueName: \"kubernetes.io/projected/5c7f5189-be91-491c-840d-92cfb5b67a81-kube-api-access-xs596\") pod \"redhat-marketplace-f7h6c\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.399125 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.399480 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-ssh-key\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.399509 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g2cm\" (UniqueName: \"kubernetes.io/projected/f8e97dd8-8609-4469-a5f8-488c6b3a2098-kube-api-access-9g2cm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.403613 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.403712 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.419010 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g2cm\" (UniqueName: \"kubernetes.io/projected/f8e97dd8-8609-4469-a5f8-488c6b3a2098-kube-api-access-9g2cm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.436185 4760 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:24 crc kubenswrapper[4760]: I1204 12:52:24.570097 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:25 crc kubenswrapper[4760]: I1204 12:52:25.031904 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f7h6c"] Dec 04 12:52:25 crc kubenswrapper[4760]: I1204 12:52:25.579078 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7h6c" event={"ID":"5c7f5189-be91-491c-840d-92cfb5b67a81","Type":"ContainerStarted","Data":"4fa486c46c6ebbea7ff80071896e0941633e7918f82fde46176ffbd6ae889e63"} Dec 04 12:52:25 crc kubenswrapper[4760]: I1204 12:52:25.663986 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd"] Dec 04 12:52:26 crc kubenswrapper[4760]: I1204 12:52:26.591366 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" event={"ID":"f8e97dd8-8609-4469-a5f8-488c6b3a2098","Type":"ContainerStarted","Data":"b87f46f88d8e93693e54721c6f78813c7cda5f4136db3f8ab2fa52b06c081a64"} Dec 04 12:52:26 crc kubenswrapper[4760]: I1204 12:52:26.591782 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" event={"ID":"f8e97dd8-8609-4469-a5f8-488c6b3a2098","Type":"ContainerStarted","Data":"fc656aad5d599641dec36f55eba8a490766024118aa6fb9d94c0b675820f2d0e"} Dec 04 12:52:26 crc kubenswrapper[4760]: I1204 12:52:26.593242 4760 generic.go:334] "Generic (PLEG): container finished" podID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerID="694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0" exitCode=0 Dec 04 12:52:26 crc kubenswrapper[4760]: I1204 12:52:26.593281 4760 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7h6c" event={"ID":"5c7f5189-be91-491c-840d-92cfb5b67a81","Type":"ContainerDied","Data":"694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0"} Dec 04 12:52:26 crc kubenswrapper[4760]: I1204 12:52:26.617730 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" podStartSLOduration=2.149099997 podStartE2EDuration="2.617708204s" podCreationTimestamp="2025-12-04 12:52:24 +0000 UTC" firstStartedPulling="2025-12-04 12:52:25.67516422 +0000 UTC m=+2348.716610787" lastFinishedPulling="2025-12-04 12:52:26.143772427 +0000 UTC m=+2349.185218994" observedRunningTime="2025-12-04 12:52:26.610303868 +0000 UTC m=+2349.651750435" watchObservedRunningTime="2025-12-04 12:52:26.617708204 +0000 UTC m=+2349.659154771" Dec 04 12:52:26 crc kubenswrapper[4760]: I1204 12:52:26.865213 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:52:26 crc kubenswrapper[4760]: E1204 12:52:26.866371 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:52:28 crc kubenswrapper[4760]: I1204 12:52:28.650926 4760 generic.go:334] "Generic (PLEG): container finished" podID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerID="0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3" exitCode=0 Dec 04 12:52:28 crc kubenswrapper[4760]: I1204 12:52:28.651046 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7h6c" 
event={"ID":"5c7f5189-be91-491c-840d-92cfb5b67a81","Type":"ContainerDied","Data":"0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3"} Dec 04 12:52:30 crc kubenswrapper[4760]: I1204 12:52:30.675437 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7h6c" event={"ID":"5c7f5189-be91-491c-840d-92cfb5b67a81","Type":"ContainerStarted","Data":"5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440"} Dec 04 12:52:30 crc kubenswrapper[4760]: I1204 12:52:30.701469 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f7h6c" podStartSLOduration=3.725703967 podStartE2EDuration="6.701437265s" podCreationTimestamp="2025-12-04 12:52:24 +0000 UTC" firstStartedPulling="2025-12-04 12:52:26.594660371 +0000 UTC m=+2349.636106938" lastFinishedPulling="2025-12-04 12:52:29.570393669 +0000 UTC m=+2352.611840236" observedRunningTime="2025-12-04 12:52:30.697622753 +0000 UTC m=+2353.739069340" watchObservedRunningTime="2025-12-04 12:52:30.701437265 +0000 UTC m=+2353.742883832" Dec 04 12:52:34 crc kubenswrapper[4760]: I1204 12:52:34.436993 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:34 crc kubenswrapper[4760]: I1204 12:52:34.437605 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:34 crc kubenswrapper[4760]: I1204 12:52:34.492349 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:34 crc kubenswrapper[4760]: I1204 12:52:34.768091 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:34 crc kubenswrapper[4760]: I1204 12:52:34.819162 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-f7h6c"] Dec 04 12:52:36 crc kubenswrapper[4760]: I1204 12:52:36.845313 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f7h6c" podUID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerName="registry-server" containerID="cri-o://5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440" gracePeriod=2 Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.382596 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.534352 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-catalog-content\") pod \"5c7f5189-be91-491c-840d-92cfb5b67a81\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.534727 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs596\" (UniqueName: \"kubernetes.io/projected/5c7f5189-be91-491c-840d-92cfb5b67a81-kube-api-access-xs596\") pod \"5c7f5189-be91-491c-840d-92cfb5b67a81\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.534758 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-utilities\") pod \"5c7f5189-be91-491c-840d-92cfb5b67a81\" (UID: \"5c7f5189-be91-491c-840d-92cfb5b67a81\") " Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.535787 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-utilities" (OuterVolumeSpecName: "utilities") pod "5c7f5189-be91-491c-840d-92cfb5b67a81" (UID: 
"5c7f5189-be91-491c-840d-92cfb5b67a81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.542196 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7f5189-be91-491c-840d-92cfb5b67a81-kube-api-access-xs596" (OuterVolumeSpecName: "kube-api-access-xs596") pod "5c7f5189-be91-491c-840d-92cfb5b67a81" (UID: "5c7f5189-be91-491c-840d-92cfb5b67a81"). InnerVolumeSpecName "kube-api-access-xs596". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.557901 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c7f5189-be91-491c-840d-92cfb5b67a81" (UID: "5c7f5189-be91-491c-840d-92cfb5b67a81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.638004 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs596\" (UniqueName: \"kubernetes.io/projected/5c7f5189-be91-491c-840d-92cfb5b67a81-kube-api-access-xs596\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.638059 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.638069 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c7f5189-be91-491c-840d-92cfb5b67a81-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.858370 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="f8e97dd8-8609-4469-a5f8-488c6b3a2098" containerID="b87f46f88d8e93693e54721c6f78813c7cda5f4136db3f8ab2fa52b06c081a64" exitCode=0 Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.858455 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" event={"ID":"f8e97dd8-8609-4469-a5f8-488c6b3a2098","Type":"ContainerDied","Data":"b87f46f88d8e93693e54721c6f78813c7cda5f4136db3f8ab2fa52b06c081a64"} Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.863070 4760 generic.go:334] "Generic (PLEG): container finished" podID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerID="5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440" exitCode=0 Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.863089 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7h6c" event={"ID":"5c7f5189-be91-491c-840d-92cfb5b67a81","Type":"ContainerDied","Data":"5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440"} Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.863138 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7h6c" event={"ID":"5c7f5189-be91-491c-840d-92cfb5b67a81","Type":"ContainerDied","Data":"4fa486c46c6ebbea7ff80071896e0941633e7918f82fde46176ffbd6ae889e63"} Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.863182 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f7h6c" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.863287 4760 scope.go:117] "RemoveContainer" containerID="5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.864292 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:52:37 crc kubenswrapper[4760]: E1204 12:52:37.864710 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.892670 4760 scope.go:117] "RemoveContainer" containerID="0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3" Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.908417 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f7h6c"] Dec 04 12:52:37 crc kubenswrapper[4760]: I1204 12:52:37.920774 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f7h6c"] Dec 04 12:52:38 crc kubenswrapper[4760]: I1204 12:52:38.148468 4760 scope.go:117] "RemoveContainer" containerID="694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0" Dec 04 12:52:38 crc kubenswrapper[4760]: I1204 12:52:38.206273 4760 scope.go:117] "RemoveContainer" containerID="5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440" Dec 04 12:52:38 crc kubenswrapper[4760]: E1204 12:52:38.207070 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440\": container with ID starting with 5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440 not found: ID does not exist" containerID="5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440" Dec 04 12:52:38 crc kubenswrapper[4760]: I1204 12:52:38.207116 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440"} err="failed to get container status \"5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440\": rpc error: code = NotFound desc = could not find container \"5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440\": container with ID starting with 5b4af0f6a1b872e12b83a7258911fc985f7534d2862706804c3aecba6a9cb440 not found: ID does not exist" Dec 04 12:52:38 crc kubenswrapper[4760]: I1204 12:52:38.207184 4760 scope.go:117] "RemoveContainer" containerID="0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3" Dec 04 12:52:38 crc kubenswrapper[4760]: E1204 12:52:38.207647 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3\": container with ID starting with 0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3 not found: ID does not exist" containerID="0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3" Dec 04 12:52:38 crc kubenswrapper[4760]: I1204 12:52:38.207674 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3"} err="failed to get container status \"0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3\": rpc error: code = NotFound desc = could not find container \"0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3\": container with ID 
starting with 0b61388c64c0a5402a75a01c1d23211ff62267e6f37b65f31ff375ecac4b9bf3 not found: ID does not exist" Dec 04 12:52:38 crc kubenswrapper[4760]: I1204 12:52:38.207687 4760 scope.go:117] "RemoveContainer" containerID="694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0" Dec 04 12:52:38 crc kubenswrapper[4760]: E1204 12:52:38.208169 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0\": container with ID starting with 694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0 not found: ID does not exist" containerID="694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0" Dec 04 12:52:38 crc kubenswrapper[4760]: I1204 12:52:38.208270 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0"} err="failed to get container status \"694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0\": rpc error: code = NotFound desc = could not find container \"694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0\": container with ID starting with 694f68a99ad3f51eda804f613dab376cb654a59db2e2a65ed69b2e70054916d0 not found: ID does not exist" Dec 04 12:52:38 crc kubenswrapper[4760]: E1204 12:52:38.319529 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c7f5189_be91_491c_840d_92cfb5b67a81.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c7f5189_be91_491c_840d_92cfb5b67a81.slice/crio-4fa486c46c6ebbea7ff80071896e0941633e7918f82fde46176ffbd6ae889e63\": RecentStats: unable to find data in memory cache]" Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.321726 4760 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.485453 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g2cm\" (UniqueName: \"kubernetes.io/projected/f8e97dd8-8609-4469-a5f8-488c6b3a2098-kube-api-access-9g2cm\") pod \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.485827 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-ssh-key\") pod \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.486477 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-inventory\") pod \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\" (UID: \"f8e97dd8-8609-4469-a5f8-488c6b3a2098\") " Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.492276 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e97dd8-8609-4469-a5f8-488c6b3a2098-kube-api-access-9g2cm" (OuterVolumeSpecName: "kube-api-access-9g2cm") pod "f8e97dd8-8609-4469-a5f8-488c6b3a2098" (UID: "f8e97dd8-8609-4469-a5f8-488c6b3a2098"). InnerVolumeSpecName "kube-api-access-9g2cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.518371 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f8e97dd8-8609-4469-a5f8-488c6b3a2098" (UID: "f8e97dd8-8609-4469-a5f8-488c6b3a2098"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.524194 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-inventory" (OuterVolumeSpecName: "inventory") pod "f8e97dd8-8609-4469-a5f8-488c6b3a2098" (UID: "f8e97dd8-8609-4469-a5f8-488c6b3a2098"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.590914 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.591161 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8e97dd8-8609-4469-a5f8-488c6b3a2098-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.591320 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g2cm\" (UniqueName: \"kubernetes.io/projected/f8e97dd8-8609-4469-a5f8-488c6b3a2098-kube-api-access-9g2cm\") on node \"crc\" DevicePath \"\"" Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.878480 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7f5189-be91-491c-840d-92cfb5b67a81" path="/var/lib/kubelet/pods/5c7f5189-be91-491c-840d-92cfb5b67a81/volumes" Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.884582 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" event={"ID":"f8e97dd8-8609-4469-a5f8-488c6b3a2098","Type":"ContainerDied","Data":"fc656aad5d599641dec36f55eba8a490766024118aa6fb9d94c0b675820f2d0e"} Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.884639 4760 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="fc656aad5d599641dec36f55eba8a490766024118aa6fb9d94c0b675820f2d0e" Dec 04 12:52:39 crc kubenswrapper[4760]: I1204 12:52:39.884664 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.059233 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t"] Dec 04 12:52:40 crc kubenswrapper[4760]: E1204 12:52:40.059910 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e97dd8-8609-4469-a5f8-488c6b3a2098" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.059937 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e97dd8-8609-4469-a5f8-488c6b3a2098" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:52:40 crc kubenswrapper[4760]: E1204 12:52:40.059966 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerName="extract-utilities" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.059974 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerName="extract-utilities" Dec 04 12:52:40 crc kubenswrapper[4760]: E1204 12:52:40.060010 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerName="extract-content" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.060016 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerName="extract-content" Dec 04 12:52:40 crc kubenswrapper[4760]: E1204 12:52:40.060033 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerName="registry-server" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 
12:52:40.060039 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerName="registry-server" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.060360 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e97dd8-8609-4469-a5f8-488c6b3a2098" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.060387 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7f5189-be91-491c-840d-92cfb5b67a81" containerName="registry-server" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.063394 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.072429 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t"] Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.074078 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.075197 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.075452 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.075579 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.075763 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.075910 4760 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.076281 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.076551 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105376 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105533 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105603 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105655 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105691 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105774 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105812 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105868 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-c8mwc\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-kube-api-access-c8mwc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105896 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105941 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.105969 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.106111 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.106362 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.106509 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.208606 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.208749 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.208816 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.208867 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.208916 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.208959 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.209034 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.209096 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.209339 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.209461 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8mwc\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-kube-api-access-c8mwc\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.209516 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.209607 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.209650 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.209733 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: 
\"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.214663 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.215121 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.215230 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.215352 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" 
Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.216703 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.217367 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.217691 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.218085 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.218091 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.219112 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.220619 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.221107 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.230659 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.231077 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8mwc\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-kube-api-access-c8mwc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6z22t\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:40 crc kubenswrapper[4760]: I1204 12:52:40.414147 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:52:41 crc kubenswrapper[4760]: I1204 12:52:41.300974 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t"] Dec 04 12:52:42 crc kubenswrapper[4760]: I1204 12:52:42.162792 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" event={"ID":"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27","Type":"ContainerStarted","Data":"80e8f6227be3b1a55273a12cbe157feaed2a1ac0cad9a02e6c255b31f0fd38d6"} Dec 04 12:52:42 crc kubenswrapper[4760]: I1204 12:52:42.195169 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" podStartSLOduration=1.624244077 podStartE2EDuration="2.195145316s" podCreationTimestamp="2025-12-04 12:52:40 +0000 UTC" firstStartedPulling="2025-12-04 12:52:41.311842687 +0000 UTC m=+2364.353289254" lastFinishedPulling="2025-12-04 12:52:41.882743926 +0000 UTC m=+2364.924190493" observedRunningTime="2025-12-04 12:52:42.187921517 +0000 UTC m=+2365.229368114" watchObservedRunningTime="2025-12-04 12:52:42.195145316 
+0000 UTC m=+2365.236591873" Dec 04 12:52:43 crc kubenswrapper[4760]: I1204 12:52:43.297810 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" event={"ID":"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27","Type":"ContainerStarted","Data":"f09b7f4ecae2efce322051274404a968ee03294273fb4b4f95536913dac07cc7"} Dec 04 12:52:48 crc kubenswrapper[4760]: I1204 12:52:48.864384 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:52:48 crc kubenswrapper[4760]: E1204 12:52:48.865067 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:53:02 crc kubenswrapper[4760]: I1204 12:53:02.865136 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:53:02 crc kubenswrapper[4760]: E1204 12:53:02.865994 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:53:15 crc kubenswrapper[4760]: I1204 12:53:15.865144 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:53:15 crc kubenswrapper[4760]: E1204 12:53:15.866163 4760 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:53:21 crc kubenswrapper[4760]: I1204 12:53:21.923721 4760 generic.go:334] "Generic (PLEG): container finished" podID="d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" containerID="f09b7f4ecae2efce322051274404a968ee03294273fb4b4f95536913dac07cc7" exitCode=0 Dec 04 12:53:21 crc kubenswrapper[4760]: I1204 12:53:21.923811 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" event={"ID":"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27","Type":"ContainerDied","Data":"f09b7f4ecae2efce322051274404a968ee03294273fb4b4f95536913dac07cc7"} Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.378872 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499406 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-neutron-metadata-combined-ca-bundle\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499487 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-bootstrap-combined-ca-bundle\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499546 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499623 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499697 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ovn-combined-ca-bundle\") pod 
\"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499715 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-telemetry-combined-ca-bundle\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499750 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499781 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8mwc\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-kube-api-access-c8mwc\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499890 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-nova-combined-ca-bundle\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499919 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-repo-setup-combined-ca-bundle\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " 
Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.499981 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-inventory\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.500007 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ssh-key\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.500039 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.500072 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-libvirt-combined-ca-bundle\") pod \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\" (UID: \"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27\") " Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.504931 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.506681 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.507132 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-kube-api-access-c8mwc" (OuterVolumeSpecName: "kube-api-access-c8mwc") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "kube-api-access-c8mwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.507178 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.507568 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.507609 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.508711 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.508895 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.508924 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.510869 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.511431 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.511630 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.534344 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.546984 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-inventory" (OuterVolumeSpecName: "inventory") pod "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" (UID: "d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603475 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603510 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603520 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603533 4760 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603545 4760 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603554 4760 reconciler_common.go:293] "Volume 
detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603566 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603577 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603587 4760 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603659 4760 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603671 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603704 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8mwc\" (UniqueName: 
\"kubernetes.io/projected/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-kube-api-access-c8mwc\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603713 4760 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.603722 4760 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.948794 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" event={"ID":"d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27","Type":"ContainerDied","Data":"80e8f6227be3b1a55273a12cbe157feaed2a1ac0cad9a02e6c255b31f0fd38d6"} Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.948846 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80e8f6227be3b1a55273a12cbe157feaed2a1ac0cad9a02e6c255b31f0fd38d6" Dec 04 12:53:23 crc kubenswrapper[4760]: I1204 12:53:23.948919 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6z22t" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.069676 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt"] Dec 04 12:53:24 crc kubenswrapper[4760]: E1204 12:53:24.070397 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.070422 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.070705 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.071658 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.074774 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.075438 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.076311 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.077223 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.078100 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.085810 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt"] Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.215992 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.216372 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv7h6\" (UniqueName: \"kubernetes.io/projected/aec90918-8692-4e3d-ba94-7b8e358b8f60-kube-api-access-sv7h6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.216535 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.216589 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.216776 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.318962 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.319085 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.319127 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv7h6\" (UniqueName: \"kubernetes.io/projected/aec90918-8692-4e3d-ba94-7b8e358b8f60-kube-api-access-sv7h6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.319258 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.319322 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.319972 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc 
kubenswrapper[4760]: I1204 12:53:24.323704 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.323719 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.323919 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.338110 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv7h6\" (UniqueName: \"kubernetes.io/projected/aec90918-8692-4e3d-ba94-7b8e358b8f60-kube-api-access-sv7h6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fg4xt\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.397413 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:53:24 crc kubenswrapper[4760]: I1204 12:53:24.991405 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt"] Dec 04 12:53:25 crc kubenswrapper[4760]: I1204 12:53:25.969057 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" event={"ID":"aec90918-8692-4e3d-ba94-7b8e358b8f60","Type":"ContainerStarted","Data":"f76693498ec3562ac7e1d3549dd7cb6c0344afdecc89c2b3f719c68c816807d8"} Dec 04 12:53:25 crc kubenswrapper[4760]: I1204 12:53:25.970306 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" event={"ID":"aec90918-8692-4e3d-ba94-7b8e358b8f60","Type":"ContainerStarted","Data":"e14f44750d08c82b3450951e5c385406e85fa04c30dbc5f966e80f6a59c9d810"} Dec 04 12:53:25 crc kubenswrapper[4760]: I1204 12:53:25.996023 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" podStartSLOduration=1.5150231889999999 podStartE2EDuration="1.995996819s" podCreationTimestamp="2025-12-04 12:53:24 +0000 UTC" firstStartedPulling="2025-12-04 12:53:25.000045228 +0000 UTC m=+2408.041491795" lastFinishedPulling="2025-12-04 12:53:25.481018858 +0000 UTC m=+2408.522465425" observedRunningTime="2025-12-04 12:53:25.991499026 +0000 UTC m=+2409.032945603" watchObservedRunningTime="2025-12-04 12:53:25.995996819 +0000 UTC m=+2409.037443386" Dec 04 12:53:27 crc kubenswrapper[4760]: I1204 12:53:27.873296 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:53:27 crc kubenswrapper[4760]: E1204 12:53:27.874192 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:53:41 crc kubenswrapper[4760]: I1204 12:53:41.931665 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:53:41 crc kubenswrapper[4760]: E1204 12:53:41.936968 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:53:52 crc kubenswrapper[4760]: I1204 12:53:52.864860 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:53:52 crc kubenswrapper[4760]: E1204 12:53:52.865853 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:54:06 crc kubenswrapper[4760]: I1204 12:54:06.865883 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:54:06 crc kubenswrapper[4760]: E1204 12:54:06.868359 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:54:19 crc kubenswrapper[4760]: I1204 12:54:19.864671 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:54:19 crc kubenswrapper[4760]: E1204 12:54:19.865276 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:54:32 crc kubenswrapper[4760]: I1204 12:54:32.319791 4760 generic.go:334] "Generic (PLEG): container finished" podID="aec90918-8692-4e3d-ba94-7b8e358b8f60" containerID="f76693498ec3562ac7e1d3549dd7cb6c0344afdecc89c2b3f719c68c816807d8" exitCode=0 Dec 04 12:54:32 crc kubenswrapper[4760]: I1204 12:54:32.319873 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" event={"ID":"aec90918-8692-4e3d-ba94-7b8e358b8f60","Type":"ContainerDied","Data":"f76693498ec3562ac7e1d3549dd7cb6c0344afdecc89c2b3f719c68c816807d8"} Dec 04 12:54:33 crc kubenswrapper[4760]: I1204 12:54:33.998457 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.067265 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovncontroller-config-0\") pod \"aec90918-8692-4e3d-ba94-7b8e358b8f60\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.067459 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ssh-key\") pod \"aec90918-8692-4e3d-ba94-7b8e358b8f60\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.067587 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-inventory\") pod \"aec90918-8692-4e3d-ba94-7b8e358b8f60\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.067615 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv7h6\" (UniqueName: \"kubernetes.io/projected/aec90918-8692-4e3d-ba94-7b8e358b8f60-kube-api-access-sv7h6\") pod \"aec90918-8692-4e3d-ba94-7b8e358b8f60\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.067684 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovn-combined-ca-bundle\") pod \"aec90918-8692-4e3d-ba94-7b8e358b8f60\" (UID: \"aec90918-8692-4e3d-ba94-7b8e358b8f60\") " Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.073875 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/aec90918-8692-4e3d-ba94-7b8e358b8f60-kube-api-access-sv7h6" (OuterVolumeSpecName: "kube-api-access-sv7h6") pod "aec90918-8692-4e3d-ba94-7b8e358b8f60" (UID: "aec90918-8692-4e3d-ba94-7b8e358b8f60"). InnerVolumeSpecName "kube-api-access-sv7h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.079379 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "aec90918-8692-4e3d-ba94-7b8e358b8f60" (UID: "aec90918-8692-4e3d-ba94-7b8e358b8f60"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.096334 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "aec90918-8692-4e3d-ba94-7b8e358b8f60" (UID: "aec90918-8692-4e3d-ba94-7b8e358b8f60"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.097588 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aec90918-8692-4e3d-ba94-7b8e358b8f60" (UID: "aec90918-8692-4e3d-ba94-7b8e358b8f60"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.098040 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-inventory" (OuterVolumeSpecName: "inventory") pod "aec90918-8692-4e3d-ba94-7b8e358b8f60" (UID: "aec90918-8692-4e3d-ba94-7b8e358b8f60"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.169373 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.169400 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv7h6\" (UniqueName: \"kubernetes.io/projected/aec90918-8692-4e3d-ba94-7b8e358b8f60-kube-api-access-sv7h6\") on node \"crc\" DevicePath \"\"" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.169410 4760 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.169418 4760 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/aec90918-8692-4e3d-ba94-7b8e358b8f60-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.169427 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aec90918-8692-4e3d-ba94-7b8e358b8f60-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.339952 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" event={"ID":"aec90918-8692-4e3d-ba94-7b8e358b8f60","Type":"ContainerDied","Data":"e14f44750d08c82b3450951e5c385406e85fa04c30dbc5f966e80f6a59c9d810"} Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.340027 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fg4xt" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.340054 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e14f44750d08c82b3450951e5c385406e85fa04c30dbc5f966e80f6a59c9d810" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.440219 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6"] Dec 04 12:54:34 crc kubenswrapper[4760]: E1204 12:54:34.440980 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec90918-8692-4e3d-ba94-7b8e358b8f60" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.441098 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec90918-8692-4e3d-ba94-7b8e358b8f60" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.441453 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec90918-8692-4e3d-ba94-7b8e358b8f60" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.442494 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.453624 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.454657 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.454858 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.455029 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.455143 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.455274 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.462110 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6"] Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.483298 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.483750 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z579g\" (UniqueName: \"kubernetes.io/projected/4e5c739f-fcc7-4384-b7e0-302daee90091-kube-api-access-z579g\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.484093 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.484341 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.484552 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.484744 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.587577 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.587660 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z579g\" (UniqueName: \"kubernetes.io/projected/4e5c739f-fcc7-4384-b7e0-302daee90091-kube-api-access-z579g\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.587711 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.587744 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.587775 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.587796 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.591682 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.592111 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.592488 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.593711 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.598480 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.609083 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z579g\" (UniqueName: \"kubernetes.io/projected/4e5c739f-fcc7-4384-b7e0-302daee90091-kube-api-access-z579g\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc 
kubenswrapper[4760]: I1204 12:54:34.760123 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:54:34 crc kubenswrapper[4760]: I1204 12:54:34.865606 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:54:34 crc kubenswrapper[4760]: E1204 12:54:34.866076 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:54:35 crc kubenswrapper[4760]: I1204 12:54:35.316034 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6"] Dec 04 12:54:35 crc kubenswrapper[4760]: I1204 12:54:35.335886 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 12:54:35 crc kubenswrapper[4760]: I1204 12:54:35.351755 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" event={"ID":"4e5c739f-fcc7-4384-b7e0-302daee90091","Type":"ContainerStarted","Data":"0a77821eb16e3e204bd489d678bc649f83800ae0c0206e96ad871368c386b12c"} Dec 04 12:54:36 crc kubenswrapper[4760]: I1204 12:54:36.368919 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" event={"ID":"4e5c739f-fcc7-4384-b7e0-302daee90091","Type":"ContainerStarted","Data":"5eae589082d2324f7623146dd992d0a6c84bcf28001bde53809e5aef7c8dfed7"} Dec 04 12:54:36 crc kubenswrapper[4760]: I1204 12:54:36.391283 4760 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" podStartSLOduration=1.67343246 podStartE2EDuration="2.391241528s" podCreationTimestamp="2025-12-04 12:54:34 +0000 UTC" firstStartedPulling="2025-12-04 12:54:35.335526108 +0000 UTC m=+2478.376972675" lastFinishedPulling="2025-12-04 12:54:36.053335166 +0000 UTC m=+2479.094781743" observedRunningTime="2025-12-04 12:54:36.389821382 +0000 UTC m=+2479.431267949" watchObservedRunningTime="2025-12-04 12:54:36.391241528 +0000 UTC m=+2479.432688095" Dec 04 12:54:48 crc kubenswrapper[4760]: I1204 12:54:48.865277 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:54:48 crc kubenswrapper[4760]: E1204 12:54:48.866231 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:55:03 crc kubenswrapper[4760]: I1204 12:55:03.864451 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:55:03 crc kubenswrapper[4760]: E1204 12:55:03.865339 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:55:15 crc kubenswrapper[4760]: I1204 12:55:15.865384 4760 
scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:55:15 crc kubenswrapper[4760]: E1204 12:55:15.866243 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:55:26 crc kubenswrapper[4760]: I1204 12:55:26.864910 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:55:26 crc kubenswrapper[4760]: E1204 12:55:26.865792 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:55:31 crc kubenswrapper[4760]: I1204 12:55:31.900749 4760 generic.go:334] "Generic (PLEG): container finished" podID="4e5c739f-fcc7-4384-b7e0-302daee90091" containerID="5eae589082d2324f7623146dd992d0a6c84bcf28001bde53809e5aef7c8dfed7" exitCode=0 Dec 04 12:55:31 crc kubenswrapper[4760]: I1204 12:55:31.900832 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" event={"ID":"4e5c739f-fcc7-4384-b7e0-302daee90091","Type":"ContainerDied","Data":"5eae589082d2324f7623146dd992d0a6c84bcf28001bde53809e5aef7c8dfed7"} Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.347007 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.512360 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4e5c739f-fcc7-4384-b7e0-302daee90091\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.512435 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-nova-metadata-neutron-config-0\") pod \"4e5c739f-fcc7-4384-b7e0-302daee90091\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.512485 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-inventory\") pod \"4e5c739f-fcc7-4384-b7e0-302daee90091\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.512519 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-ssh-key\") pod \"4e5c739f-fcc7-4384-b7e0-302daee90091\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.512676 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z579g\" (UniqueName: \"kubernetes.io/projected/4e5c739f-fcc7-4384-b7e0-302daee90091-kube-api-access-z579g\") pod \"4e5c739f-fcc7-4384-b7e0-302daee90091\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 
12:55:33.513757 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-metadata-combined-ca-bundle\") pod \"4e5c739f-fcc7-4384-b7e0-302daee90091\" (UID: \"4e5c739f-fcc7-4384-b7e0-302daee90091\") " Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.518663 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5c739f-fcc7-4384-b7e0-302daee90091-kube-api-access-z579g" (OuterVolumeSpecName: "kube-api-access-z579g") pod "4e5c739f-fcc7-4384-b7e0-302daee90091" (UID: "4e5c739f-fcc7-4384-b7e0-302daee90091"). InnerVolumeSpecName "kube-api-access-z579g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.519374 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4e5c739f-fcc7-4384-b7e0-302daee90091" (UID: "4e5c739f-fcc7-4384-b7e0-302daee90091"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.544668 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e5c739f-fcc7-4384-b7e0-302daee90091" (UID: "4e5c739f-fcc7-4384-b7e0-302daee90091"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.545381 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4e5c739f-fcc7-4384-b7e0-302daee90091" (UID: "4e5c739f-fcc7-4384-b7e0-302daee90091"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.545768 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-inventory" (OuterVolumeSpecName: "inventory") pod "4e5c739f-fcc7-4384-b7e0-302daee90091" (UID: "4e5c739f-fcc7-4384-b7e0-302daee90091"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.552521 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4e5c739f-fcc7-4384-b7e0-302daee90091" (UID: "4e5c739f-fcc7-4384-b7e0-302daee90091"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.618819 4760 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.618860 4760 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.618873 4760 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.618882 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.618891 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e5c739f-fcc7-4384-b7e0-302daee90091-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.618899 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z579g\" (UniqueName: \"kubernetes.io/projected/4e5c739f-fcc7-4384-b7e0-302daee90091-kube-api-access-z579g\") on node \"crc\" DevicePath \"\"" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.919935 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" 
event={"ID":"4e5c739f-fcc7-4384-b7e0-302daee90091","Type":"ContainerDied","Data":"0a77821eb16e3e204bd489d678bc649f83800ae0c0206e96ad871368c386b12c"} Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.920262 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a77821eb16e3e204bd489d678bc649f83800ae0c0206e96ad871368c386b12c" Dec 04 12:55:33 crc kubenswrapper[4760]: I1204 12:55:33.920002 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.117512 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x"] Dec 04 12:55:34 crc kubenswrapper[4760]: E1204 12:55:34.118020 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5c739f-fcc7-4384-b7e0-302daee90091" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.118044 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5c739f-fcc7-4384-b7e0-302daee90091" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.118683 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5c739f-fcc7-4384-b7e0-302daee90091" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.119423 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.122433 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.122453 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.122497 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.122616 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.122840 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.126393 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.126566 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.126768 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-gvngv\" (UniqueName: \"kubernetes.io/projected/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-kube-api-access-gvngv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.126849 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.127246 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.139177 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x"] Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.229065 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.229130 4760 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.229192 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.229279 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvngv\" (UniqueName: \"kubernetes.io/projected/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-kube-api-access-gvngv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.229311 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.234792 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 
crc kubenswrapper[4760]: I1204 12:55:34.236225 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.236411 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.246722 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.247175 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvngv\" (UniqueName: \"kubernetes.io/projected/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-kube-api-access-gvngv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.437799 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:55:34 crc kubenswrapper[4760]: W1204 12:55:34.971182 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2dcfb70_5791_401e_a7d3_cec6bf1f4dba.slice/crio-4e64263c547977837d9a5446f6b00abe49df73c4913182e057b60cda1f053139 WatchSource:0}: Error finding container 4e64263c547977837d9a5446f6b00abe49df73c4913182e057b60cda1f053139: Status 404 returned error can't find the container with id 4e64263c547977837d9a5446f6b00abe49df73c4913182e057b60cda1f053139 Dec 04 12:55:34 crc kubenswrapper[4760]: I1204 12:55:34.980110 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x"] Dec 04 12:55:35 crc kubenswrapper[4760]: I1204 12:55:35.940635 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" event={"ID":"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba","Type":"ContainerStarted","Data":"4e64263c547977837d9a5446f6b00abe49df73c4913182e057b60cda1f053139"} Dec 04 12:55:36 crc kubenswrapper[4760]: I1204 12:55:36.953000 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" event={"ID":"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba","Type":"ContainerStarted","Data":"b699ef94fda136395ff15ed0c70f96f7c7b1edfeb0a39fc512d2f05a30b6da07"} Dec 04 12:55:36 crc kubenswrapper[4760]: I1204 12:55:36.978922 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" podStartSLOduration=1.404871528 podStartE2EDuration="2.978901207s" podCreationTimestamp="2025-12-04 12:55:34 +0000 UTC" firstStartedPulling="2025-12-04 12:55:34.974115255 +0000 UTC m=+2538.015561822" lastFinishedPulling="2025-12-04 12:55:36.548144934 +0000 UTC m=+2539.589591501" 
observedRunningTime="2025-12-04 12:55:36.974175817 +0000 UTC m=+2540.015622384" watchObservedRunningTime="2025-12-04 12:55:36.978901207 +0000 UTC m=+2540.020347774" Dec 04 12:55:41 crc kubenswrapper[4760]: I1204 12:55:41.864778 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:55:41 crc kubenswrapper[4760]: E1204 12:55:41.865676 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:55:54 crc kubenswrapper[4760]: I1204 12:55:54.863977 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:55:54 crc kubenswrapper[4760]: E1204 12:55:54.864859 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:56:05 crc kubenswrapper[4760]: I1204 12:56:05.868044 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:56:05 crc kubenswrapper[4760]: E1204 12:56:05.869174 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:56:17 crc kubenswrapper[4760]: I1204 12:56:17.879864 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:56:17 crc kubenswrapper[4760]: E1204 12:56:17.880787 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:56:30 crc kubenswrapper[4760]: I1204 12:56:30.864640 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:56:30 crc kubenswrapper[4760]: E1204 12:56:30.865412 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:56:42 crc kubenswrapper[4760]: I1204 12:56:42.273464 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-58wzk" podUID="2623f14b-9edc-48cd-aeba-08cc1155890f" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 12:56:45 crc kubenswrapper[4760]: I1204 12:56:45.865319 4760 scope.go:117] "RemoveContainer" 
containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:56:45 crc kubenswrapper[4760]: E1204 12:56:45.866155 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:56:59 crc kubenswrapper[4760]: I1204 12:56:59.864753 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:56:59 crc kubenswrapper[4760]: E1204 12:56:59.865841 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 12:57:10 crc kubenswrapper[4760]: I1204 12:57:10.864810 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 12:57:11 crc kubenswrapper[4760]: I1204 12:57:11.128462 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"233692fc76a20916f8002a1dc862a924e8d80bcf11a573807153e1e74b91d84e"} Dec 04 12:59:33 crc kubenswrapper[4760]: I1204 12:59:33.381115 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 12:59:33 crc kubenswrapper[4760]: I1204 12:59:33.381683 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 12:59:56 crc kubenswrapper[4760]: I1204 12:59:56.728531 4760 generic.go:334] "Generic (PLEG): container finished" podID="e2dcfb70-5791-401e-a7d3-cec6bf1f4dba" containerID="b699ef94fda136395ff15ed0c70f96f7c7b1edfeb0a39fc512d2f05a30b6da07" exitCode=0 Dec 04 12:59:56 crc kubenswrapper[4760]: I1204 12:59:56.728624 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" event={"ID":"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba","Type":"ContainerDied","Data":"b699ef94fda136395ff15ed0c70f96f7c7b1edfeb0a39fc512d2f05a30b6da07"} Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.170034 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.270963 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvngv\" (UniqueName: \"kubernetes.io/projected/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-kube-api-access-gvngv\") pod \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.271084 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-secret-0\") pod \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.271188 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-combined-ca-bundle\") pod \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.271269 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-inventory\") pod \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.271416 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-ssh-key\") pod \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\" (UID: \"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba\") " Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.277928 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-kube-api-access-gvngv" (OuterVolumeSpecName: "kube-api-access-gvngv") pod "e2dcfb70-5791-401e-a7d3-cec6bf1f4dba" (UID: "e2dcfb70-5791-401e-a7d3-cec6bf1f4dba"). InnerVolumeSpecName "kube-api-access-gvngv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.278579 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e2dcfb70-5791-401e-a7d3-cec6bf1f4dba" (UID: "e2dcfb70-5791-401e-a7d3-cec6bf1f4dba"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.302740 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e2dcfb70-5791-401e-a7d3-cec6bf1f4dba" (UID: "e2dcfb70-5791-401e-a7d3-cec6bf1f4dba"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.303573 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-inventory" (OuterVolumeSpecName: "inventory") pod "e2dcfb70-5791-401e-a7d3-cec6bf1f4dba" (UID: "e2dcfb70-5791-401e-a7d3-cec6bf1f4dba"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.307852 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e2dcfb70-5791-401e-a7d3-cec6bf1f4dba" (UID: "e2dcfb70-5791-401e-a7d3-cec6bf1f4dba"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.374419 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.374460 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvngv\" (UniqueName: \"kubernetes.io/projected/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-kube-api-access-gvngv\") on node \"crc\" DevicePath \"\"" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.374474 4760 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.374512 4760 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.374521 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2dcfb70-5791-401e-a7d3-cec6bf1f4dba-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.747698 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" event={"ID":"e2dcfb70-5791-401e-a7d3-cec6bf1f4dba","Type":"ContainerDied","Data":"4e64263c547977837d9a5446f6b00abe49df73c4913182e057b60cda1f053139"} Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.747743 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e64263c547977837d9a5446f6b00abe49df73c4913182e057b60cda1f053139" Dec 04 12:59:58 
crc kubenswrapper[4760]: I1204 12:59:58.747804 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.857489 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275"] Dec 04 12:59:58 crc kubenswrapper[4760]: E1204 12:59:58.857929 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2dcfb70-5791-401e-a7d3-cec6bf1f4dba" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.857942 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2dcfb70-5791-401e-a7d3-cec6bf1f4dba" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.858159 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2dcfb70-5791-401e-a7d3-cec6bf1f4dba" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.858873 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.861913 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.861949 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.862180 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.862894 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.863340 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.863355 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.863514 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.874915 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275"] Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.994727 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:58 crc kubenswrapper[4760]: 
I1204 12:59:58.994940 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.995082 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x8hx\" (UniqueName: \"kubernetes.io/projected/b8bba20c-b75b-40da-98ad-436a4d121d13-kube-api-access-2x8hx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.995318 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.995588 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.995628 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.995727 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.995807 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:58 crc kubenswrapper[4760]: I1204 12:59:58.995976 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.098984 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.099077 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.099126 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x8hx\" (UniqueName: \"kubernetes.io/projected/b8bba20c-b75b-40da-98ad-436a4d121d13-kube-api-access-2x8hx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.099207 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.099317 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.099352 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.099404 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.099430 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.099551 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.101354 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc 
kubenswrapper[4760]: I1204 12:59:59.104145 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.104448 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.104655 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.104682 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.105343 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: 
\"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.116039 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.117027 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.117989 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x8hx\" (UniqueName: \"kubernetes.io/projected/b8bba20c-b75b-40da-98ad-436a4d121d13-kube-api-access-2x8hx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fx275\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.177631 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.731363 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.732316 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275"] Dec 04 12:59:59 crc kubenswrapper[4760]: I1204 12:59:59.761308 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" event={"ID":"b8bba20c-b75b-40da-98ad-436a4d121d13","Type":"ContainerStarted","Data":"fca8f6e8698e692b1b59f712d14070e4a5d6a70fe87cf0e01929a1564398d2aa"} Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.154221 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf"] Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.156412 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.161367 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.163606 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.167725 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf"] Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.236963 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-secret-volume\") pod \"collect-profiles-29414220-4jjgf\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.238814 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzcm2\" (UniqueName: \"kubernetes.io/projected/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-kube-api-access-vzcm2\") pod \"collect-profiles-29414220-4jjgf\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.239038 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-config-volume\") pod \"collect-profiles-29414220-4jjgf\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.341753 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzcm2\" (UniqueName: \"kubernetes.io/projected/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-kube-api-access-vzcm2\") pod \"collect-profiles-29414220-4jjgf\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.342175 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-config-volume\") pod \"collect-profiles-29414220-4jjgf\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.342475 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-secret-volume\") pod \"collect-profiles-29414220-4jjgf\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.343731 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-config-volume\") pod \"collect-profiles-29414220-4jjgf\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.358010 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-secret-volume\") pod \"collect-profiles-29414220-4jjgf\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.362114 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzcm2\" (UniqueName: \"kubernetes.io/projected/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-kube-api-access-vzcm2\") pod \"collect-profiles-29414220-4jjgf\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.422268 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.700473 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf"] Dec 04 13:00:00 crc kubenswrapper[4760]: W1204 13:00:00.716555 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod258b2022_1b3b_4fc8_a8fb_fa19b450eb98.slice/crio-ef718b9b5bc2b748eca2d556337923fe8841e5b1b5c8a49b56fa5c793226aee7 WatchSource:0}: Error finding container ef718b9b5bc2b748eca2d556337923fe8841e5b1b5c8a49b56fa5c793226aee7: Status 404 returned error can't find the container with id ef718b9b5bc2b748eca2d556337923fe8841e5b1b5c8a49b56fa5c793226aee7 Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.781355 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" event={"ID":"b8bba20c-b75b-40da-98ad-436a4d121d13","Type":"ContainerStarted","Data":"0d623ba0e48388c48659430a6296dc295cb6806d8f253a4f545d87dd78d15d5e"} Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 
13:00:00.783461 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" event={"ID":"258b2022-1b3b-4fc8-a8fb-fa19b450eb98","Type":"ContainerStarted","Data":"ef718b9b5bc2b748eca2d556337923fe8841e5b1b5c8a49b56fa5c793226aee7"} Dec 04 13:00:00 crc kubenswrapper[4760]: I1204 13:00:00.812209 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" podStartSLOduration=2.378816136 podStartE2EDuration="2.812178969s" podCreationTimestamp="2025-12-04 12:59:58 +0000 UTC" firstStartedPulling="2025-12-04 12:59:59.730962856 +0000 UTC m=+2802.772409423" lastFinishedPulling="2025-12-04 13:00:00.164325689 +0000 UTC m=+2803.205772256" observedRunningTime="2025-12-04 13:00:00.802073787 +0000 UTC m=+2803.843520374" watchObservedRunningTime="2025-12-04 13:00:00.812178969 +0000 UTC m=+2803.853625546" Dec 04 13:00:01 crc kubenswrapper[4760]: I1204 13:00:01.794161 4760 generic.go:334] "Generic (PLEG): container finished" podID="258b2022-1b3b-4fc8-a8fb-fa19b450eb98" containerID="5ed4b9c46ab431b86548ccc97f133ee0f1e1c1b44af1316faa545d3adc41d5a9" exitCode=0 Dec 04 13:00:01 crc kubenswrapper[4760]: I1204 13:00:01.794338 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" event={"ID":"258b2022-1b3b-4fc8-a8fb-fa19b450eb98","Type":"ContainerDied","Data":"5ed4b9c46ab431b86548ccc97f133ee0f1e1c1b44af1316faa545d3adc41d5a9"} Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.210865 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.229298 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-secret-volume\") pod \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.229345 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-config-volume\") pod \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.229588 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzcm2\" (UniqueName: \"kubernetes.io/projected/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-kube-api-access-vzcm2\") pod \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\" (UID: \"258b2022-1b3b-4fc8-a8fb-fa19b450eb98\") " Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.230387 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-config-volume" (OuterVolumeSpecName: "config-volume") pod "258b2022-1b3b-4fc8-a8fb-fa19b450eb98" (UID: "258b2022-1b3b-4fc8-a8fb-fa19b450eb98"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.230709 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.234902 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "258b2022-1b3b-4fc8-a8fb-fa19b450eb98" (UID: "258b2022-1b3b-4fc8-a8fb-fa19b450eb98"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.235421 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-kube-api-access-vzcm2" (OuterVolumeSpecName: "kube-api-access-vzcm2") pod "258b2022-1b3b-4fc8-a8fb-fa19b450eb98" (UID: "258b2022-1b3b-4fc8-a8fb-fa19b450eb98"). InnerVolumeSpecName "kube-api-access-vzcm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.332466 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.332508 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzcm2\" (UniqueName: \"kubernetes.io/projected/258b2022-1b3b-4fc8-a8fb-fa19b450eb98-kube-api-access-vzcm2\") on node \"crc\" DevicePath \"\"" Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.380151 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.380346 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.815169 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" event={"ID":"258b2022-1b3b-4fc8-a8fb-fa19b450eb98","Type":"ContainerDied","Data":"ef718b9b5bc2b748eca2d556337923fe8841e5b1b5c8a49b56fa5c793226aee7"} Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.815244 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef718b9b5bc2b748eca2d556337923fe8841e5b1b5c8a49b56fa5c793226aee7" Dec 04 13:00:03 crc kubenswrapper[4760]: I1204 13:00:03.815264 4760 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf" Dec 04 13:00:04 crc kubenswrapper[4760]: I1204 13:00:04.288567 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx"] Dec 04 13:00:04 crc kubenswrapper[4760]: I1204 13:00:04.297748 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414175-9zcnx"] Dec 04 13:00:05 crc kubenswrapper[4760]: I1204 13:00:05.880454 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd54983-b7fb-4b28-93fd-d2b9d5b881f0" path="/var/lib/kubelet/pods/dbd54983-b7fb-4b28-93fd-d2b9d5b881f0/volumes" Dec 04 13:00:33 crc kubenswrapper[4760]: I1204 13:00:33.380795 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:00:33 crc kubenswrapper[4760]: I1204 13:00:33.381323 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:00:33 crc kubenswrapper[4760]: I1204 13:00:33.381367 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 13:00:33 crc kubenswrapper[4760]: I1204 13:00:33.382297 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"233692fc76a20916f8002a1dc862a924e8d80bcf11a573807153e1e74b91d84e"} 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 13:00:33 crc kubenswrapper[4760]: I1204 13:00:33.382356 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://233692fc76a20916f8002a1dc862a924e8d80bcf11a573807153e1e74b91d84e" gracePeriod=600 Dec 04 13:00:34 crc kubenswrapper[4760]: I1204 13:00:34.086201 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="233692fc76a20916f8002a1dc862a924e8d80bcf11a573807153e1e74b91d84e" exitCode=0 Dec 04 13:00:34 crc kubenswrapper[4760]: I1204 13:00:34.086247 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"233692fc76a20916f8002a1dc862a924e8d80bcf11a573807153e1e74b91d84e"} Dec 04 13:00:34 crc kubenswrapper[4760]: I1204 13:00:34.086574 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184"} Dec 04 13:00:34 crc kubenswrapper[4760]: I1204 13:00:34.086599 4760 scope.go:117] "RemoveContainer" containerID="a26ecf8083ea2b0d353f1535d11351ed5bf0453f6dfd5394f7f259243a216b8f" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.163013 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29414221-d4ff5"] Dec 04 13:01:00 crc kubenswrapper[4760]: E1204 13:01:00.164155 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258b2022-1b3b-4fc8-a8fb-fa19b450eb98" 
containerName="collect-profiles" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.164549 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="258b2022-1b3b-4fc8-a8fb-fa19b450eb98" containerName="collect-profiles" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.164898 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="258b2022-1b3b-4fc8-a8fb-fa19b450eb98" containerName="collect-profiles" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.166330 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.185053 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414221-d4ff5"] Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.259465 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hr78\" (UniqueName: \"kubernetes.io/projected/a5b9d885-9739-498a-bd0e-fd78e0d5c779-kube-api-access-8hr78\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.259879 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-fernet-keys\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.260116 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-combined-ca-bundle\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " 
pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.260223 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-config-data\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.362904 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hr78\" (UniqueName: \"kubernetes.io/projected/a5b9d885-9739-498a-bd0e-fd78e0d5c779-kube-api-access-8hr78\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.363070 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-fernet-keys\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.363153 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-combined-ca-bundle\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.363276 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-config-data\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " 
pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.525797 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-fernet-keys\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.526537 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-config-data\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.535775 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-combined-ca-bundle\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.537376 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hr78\" (UniqueName: \"kubernetes.io/projected/a5b9d885-9739-498a-bd0e-fd78e0d5c779-kube-api-access-8hr78\") pod \"keystone-cron-29414221-d4ff5\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.575356 4760 scope.go:117] "RemoveContainer" containerID="9bf669ac9023994bb14325c673087aa4cb6a2eb0c475d77c77268754cbdb8f61" Dec 04 13:01:00 crc kubenswrapper[4760]: I1204 13:01:00.808026 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:01 crc kubenswrapper[4760]: I1204 13:01:01.366053 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414221-d4ff5"] Dec 04 13:01:02 crc kubenswrapper[4760]: I1204 13:01:02.353470 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414221-d4ff5" event={"ID":"a5b9d885-9739-498a-bd0e-fd78e0d5c779","Type":"ContainerStarted","Data":"b5e05ed66314b2016fbc22474d19b60b6f97573141a3d12a295d40e43b395598"} Dec 04 13:01:02 crc kubenswrapper[4760]: I1204 13:01:02.354033 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414221-d4ff5" event={"ID":"a5b9d885-9739-498a-bd0e-fd78e0d5c779","Type":"ContainerStarted","Data":"abcfafd8b19b5c1b1e5c2e6ec5a01d697447c5e3d68f54094fb97152a2f63ba9"} Dec 04 13:01:02 crc kubenswrapper[4760]: I1204 13:01:02.373081 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29414221-d4ff5" podStartSLOduration=2.3730583960000002 podStartE2EDuration="2.373058396s" podCreationTimestamp="2025-12-04 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 13:01:02.370839535 +0000 UTC m=+2865.412286112" watchObservedRunningTime="2025-12-04 13:01:02.373058396 +0000 UTC m=+2865.414504963" Dec 04 13:01:04 crc kubenswrapper[4760]: I1204 13:01:04.371981 4760 generic.go:334] "Generic (PLEG): container finished" podID="a5b9d885-9739-498a-bd0e-fd78e0d5c779" containerID="b5e05ed66314b2016fbc22474d19b60b6f97573141a3d12a295d40e43b395598" exitCode=0 Dec 04 13:01:04 crc kubenswrapper[4760]: I1204 13:01:04.372074 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414221-d4ff5" 
event={"ID":"a5b9d885-9739-498a-bd0e-fd78e0d5c779","Type":"ContainerDied","Data":"b5e05ed66314b2016fbc22474d19b60b6f97573141a3d12a295d40e43b395598"} Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.881682 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.890301 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-config-data\") pod \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.890543 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-fernet-keys\") pod \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.890650 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-combined-ca-bundle\") pod \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.890685 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hr78\" (UniqueName: \"kubernetes.io/projected/a5b9d885-9739-498a-bd0e-fd78e0d5c779-kube-api-access-8hr78\") pod \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\" (UID: \"a5b9d885-9739-498a-bd0e-fd78e0d5c779\") " Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.896248 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b9d885-9739-498a-bd0e-fd78e0d5c779-kube-api-access-8hr78" 
(OuterVolumeSpecName: "kube-api-access-8hr78") pod "a5b9d885-9739-498a-bd0e-fd78e0d5c779" (UID: "a5b9d885-9739-498a-bd0e-fd78e0d5c779"). InnerVolumeSpecName "kube-api-access-8hr78". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.897462 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a5b9d885-9739-498a-bd0e-fd78e0d5c779" (UID: "a5b9d885-9739-498a-bd0e-fd78e0d5c779"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.931172 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5b9d885-9739-498a-bd0e-fd78e0d5c779" (UID: "a5b9d885-9739-498a-bd0e-fd78e0d5c779"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.964692 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-config-data" (OuterVolumeSpecName: "config-data") pod "a5b9d885-9739-498a-bd0e-fd78e0d5c779" (UID: "a5b9d885-9739-498a-bd0e-fd78e0d5c779"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.994815 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hr78\" (UniqueName: \"kubernetes.io/projected/a5b9d885-9739-498a-bd0e-fd78e0d5c779-kube-api-access-8hr78\") on node \"crc\" DevicePath \"\"" Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.994943 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.995004 4760 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 13:01:05 crc kubenswrapper[4760]: I1204 13:01:05.995058 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b9d885-9739-498a-bd0e-fd78e0d5c779-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 13:01:06 crc kubenswrapper[4760]: I1204 13:01:06.511579 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414221-d4ff5" event={"ID":"a5b9d885-9739-498a-bd0e-fd78e0d5c779","Type":"ContainerDied","Data":"abcfafd8b19b5c1b1e5c2e6ec5a01d697447c5e3d68f54094fb97152a2f63ba9"} Dec 04 13:01:06 crc kubenswrapper[4760]: I1204 13:01:06.511855 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abcfafd8b19b5c1b1e5c2e6ec5a01d697447c5e3d68f54094fb97152a2f63ba9" Dec 04 13:01:06 crc kubenswrapper[4760]: I1204 13:01:06.511656 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414221-d4ff5" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.313263 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w244t"] Dec 04 13:02:21 crc kubenswrapper[4760]: E1204 13:02:21.314373 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b9d885-9739-498a-bd0e-fd78e0d5c779" containerName="keystone-cron" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.314387 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b9d885-9739-498a-bd0e-fd78e0d5c779" containerName="keystone-cron" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.314594 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b9d885-9739-498a-bd0e-fd78e0d5c779" containerName="keystone-cron" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.316346 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.330006 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w244t"] Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.378654 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-catalog-content\") pod \"community-operators-w244t\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.379389 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ttdk\" (UniqueName: \"kubernetes.io/projected/21050675-a756-485b-9d91-bf9f00c8fd48-kube-api-access-8ttdk\") pod \"community-operators-w244t\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " 
pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.379600 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-utilities\") pod \"community-operators-w244t\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.482554 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-catalog-content\") pod \"community-operators-w244t\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.482643 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ttdk\" (UniqueName: \"kubernetes.io/projected/21050675-a756-485b-9d91-bf9f00c8fd48-kube-api-access-8ttdk\") pod \"community-operators-w244t\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.482711 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-utilities\") pod \"community-operators-w244t\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.483257 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-utilities\") pod \"community-operators-w244t\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " 
pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.483280 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-catalog-content\") pod \"community-operators-w244t\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.503550 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ttdk\" (UniqueName: \"kubernetes.io/projected/21050675-a756-485b-9d91-bf9f00c8fd48-kube-api-access-8ttdk\") pod \"community-operators-w244t\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:21 crc kubenswrapper[4760]: I1204 13:02:21.641359 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:22 crc kubenswrapper[4760]: I1204 13:02:22.237305 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w244t"] Dec 04 13:02:23 crc kubenswrapper[4760]: I1204 13:02:23.229605 4760 generic.go:334] "Generic (PLEG): container finished" podID="21050675-a756-485b-9d91-bf9f00c8fd48" containerID="6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae" exitCode=0 Dec 04 13:02:23 crc kubenswrapper[4760]: I1204 13:02:23.229678 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w244t" event={"ID":"21050675-a756-485b-9d91-bf9f00c8fd48","Type":"ContainerDied","Data":"6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae"} Dec 04 13:02:23 crc kubenswrapper[4760]: I1204 13:02:23.230183 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w244t" 
event={"ID":"21050675-a756-485b-9d91-bf9f00c8fd48","Type":"ContainerStarted","Data":"6ad8e522c4050e07f4b096d8c4092405388fa654dd22b07d16d8a7ec97db4fa5"} Dec 04 13:02:24 crc kubenswrapper[4760]: I1204 13:02:24.242306 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w244t" event={"ID":"21050675-a756-485b-9d91-bf9f00c8fd48","Type":"ContainerStarted","Data":"c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8"} Dec 04 13:02:25 crc kubenswrapper[4760]: I1204 13:02:25.253811 4760 generic.go:334] "Generic (PLEG): container finished" podID="21050675-a756-485b-9d91-bf9f00c8fd48" containerID="c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8" exitCode=0 Dec 04 13:02:25 crc kubenswrapper[4760]: I1204 13:02:25.253891 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w244t" event={"ID":"21050675-a756-485b-9d91-bf9f00c8fd48","Type":"ContainerDied","Data":"c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8"} Dec 04 13:02:26 crc kubenswrapper[4760]: I1204 13:02:26.269190 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w244t" event={"ID":"21050675-a756-485b-9d91-bf9f00c8fd48","Type":"ContainerStarted","Data":"2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a"} Dec 04 13:02:26 crc kubenswrapper[4760]: I1204 13:02:26.295742 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w244t" podStartSLOduration=2.815118484 podStartE2EDuration="5.29571615s" podCreationTimestamp="2025-12-04 13:02:21 +0000 UTC" firstStartedPulling="2025-12-04 13:02:23.232379299 +0000 UTC m=+2946.273825886" lastFinishedPulling="2025-12-04 13:02:25.712976985 +0000 UTC m=+2948.754423552" observedRunningTime="2025-12-04 13:02:26.287519279 +0000 UTC m=+2949.328965866" watchObservedRunningTime="2025-12-04 13:02:26.29571615 +0000 UTC 
m=+2949.337162717" Dec 04 13:02:31 crc kubenswrapper[4760]: I1204 13:02:31.642516 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:31 crc kubenswrapper[4760]: I1204 13:02:31.643121 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:31 crc kubenswrapper[4760]: I1204 13:02:31.692542 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:32 crc kubenswrapper[4760]: I1204 13:02:32.380575 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:32 crc kubenswrapper[4760]: I1204 13:02:32.447145 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w244t"] Dec 04 13:02:33 crc kubenswrapper[4760]: I1204 13:02:33.380513 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:02:33 crc kubenswrapper[4760]: I1204 13:02:33.380569 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:02:34 crc kubenswrapper[4760]: I1204 13:02:34.346393 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w244t" podUID="21050675-a756-485b-9d91-bf9f00c8fd48" containerName="registry-server" 
containerID="cri-o://2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a" gracePeriod=2 Dec 04 13:02:34 crc kubenswrapper[4760]: E1204 13:02:34.651605 4760 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21050675_a756_485b_9d91_bf9f00c8fd48.slice/crio-conmon-2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a.scope\": RecentStats: unable to find data in memory cache]" Dec 04 13:02:34 crc kubenswrapper[4760]: I1204 13:02:34.892602 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:34 crc kubenswrapper[4760]: I1204 13:02:34.998759 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-utilities\") pod \"21050675-a756-485b-9d91-bf9f00c8fd48\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " Dec 04 13:02:34 crc kubenswrapper[4760]: I1204 13:02:34.998884 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-catalog-content\") pod \"21050675-a756-485b-9d91-bf9f00c8fd48\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " Dec 04 13:02:34 crc kubenswrapper[4760]: I1204 13:02:34.998909 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ttdk\" (UniqueName: \"kubernetes.io/projected/21050675-a756-485b-9d91-bf9f00c8fd48-kube-api-access-8ttdk\") pod \"21050675-a756-485b-9d91-bf9f00c8fd48\" (UID: \"21050675-a756-485b-9d91-bf9f00c8fd48\") " Dec 04 13:02:34 crc kubenswrapper[4760]: I1204 13:02:34.999881 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-utilities" (OuterVolumeSpecName: "utilities") pod "21050675-a756-485b-9d91-bf9f00c8fd48" (UID: "21050675-a756-485b-9d91-bf9f00c8fd48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.004584 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21050675-a756-485b-9d91-bf9f00c8fd48-kube-api-access-8ttdk" (OuterVolumeSpecName: "kube-api-access-8ttdk") pod "21050675-a756-485b-9d91-bf9f00c8fd48" (UID: "21050675-a756-485b-9d91-bf9f00c8fd48"). InnerVolumeSpecName "kube-api-access-8ttdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.051899 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21050675-a756-485b-9d91-bf9f00c8fd48" (UID: "21050675-a756-485b-9d91-bf9f00c8fd48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.101827 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.101868 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21050675-a756-485b-9d91-bf9f00c8fd48-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.101885 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ttdk\" (UniqueName: \"kubernetes.io/projected/21050675-a756-485b-9d91-bf9f00c8fd48-kube-api-access-8ttdk\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.358098 4760 generic.go:334] "Generic (PLEG): container finished" podID="21050675-a756-485b-9d91-bf9f00c8fd48" containerID="2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a" exitCode=0 Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.358149 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w244t" event={"ID":"21050675-a756-485b-9d91-bf9f00c8fd48","Type":"ContainerDied","Data":"2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a"} Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.358177 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w244t" event={"ID":"21050675-a756-485b-9d91-bf9f00c8fd48","Type":"ContainerDied","Data":"6ad8e522c4050e07f4b096d8c4092405388fa654dd22b07d16d8a7ec97db4fa5"} Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.358194 4760 scope.go:117] "RemoveContainer" containerID="2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 
13:02:35.358349 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w244t" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.396866 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w244t"] Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.410118 4760 scope.go:117] "RemoveContainer" containerID="c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.410645 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w244t"] Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.439682 4760 scope.go:117] "RemoveContainer" containerID="6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.486017 4760 scope.go:117] "RemoveContainer" containerID="2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a" Dec 04 13:02:35 crc kubenswrapper[4760]: E1204 13:02:35.490610 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a\": container with ID starting with 2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a not found: ID does not exist" containerID="2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.490659 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a"} err="failed to get container status \"2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a\": rpc error: code = NotFound desc = could not find container \"2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a\": container with ID starting with 
2c8cb34dc505243b6035f33aa88c6e09cdc855d164d314555435968f8bb3395a not found: ID does not exist" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.490690 4760 scope.go:117] "RemoveContainer" containerID="c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8" Dec 04 13:02:35 crc kubenswrapper[4760]: E1204 13:02:35.491250 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8\": container with ID starting with c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8 not found: ID does not exist" containerID="c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.491276 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8"} err="failed to get container status \"c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8\": rpc error: code = NotFound desc = could not find container \"c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8\": container with ID starting with c15a745c7a6b522dd300fa745740305cfba3fb1675dea53b2c2cd344d37e1bd8 not found: ID does not exist" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.491295 4760 scope.go:117] "RemoveContainer" containerID="6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae" Dec 04 13:02:35 crc kubenswrapper[4760]: E1204 13:02:35.491744 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae\": container with ID starting with 6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae not found: ID does not exist" containerID="6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae" Dec 04 13:02:35 crc 
kubenswrapper[4760]: I1204 13:02:35.491766 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae"} err="failed to get container status \"6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae\": rpc error: code = NotFound desc = could not find container \"6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae\": container with ID starting with 6fd54ee5b3a87850c0f235dc8b8ecb07b938a60e0562afc4e3e2076dd904deae not found: ID does not exist" Dec 04 13:02:35 crc kubenswrapper[4760]: I1204 13:02:35.876529 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21050675-a756-485b-9d91-bf9f00c8fd48" path="/var/lib/kubelet/pods/21050675-a756-485b-9d91-bf9f00c8fd48/volumes" Dec 04 13:02:42 crc kubenswrapper[4760]: I1204 13:02:42.857943 4760 generic.go:334] "Generic (PLEG): container finished" podID="b8bba20c-b75b-40da-98ad-436a4d121d13" containerID="0d623ba0e48388c48659430a6296dc295cb6806d8f253a4f545d87dd78d15d5e" exitCode=0 Dec 04 13:02:42 crc kubenswrapper[4760]: I1204 13:02:42.858049 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" event={"ID":"b8bba20c-b75b-40da-98ad-436a4d121d13","Type":"ContainerDied","Data":"0d623ba0e48388c48659430a6296dc295cb6806d8f253a4f545d87dd78d15d5e"} Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.340585 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.490796 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-0\") pod \"b8bba20c-b75b-40da-98ad-436a4d121d13\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.490998 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-combined-ca-bundle\") pod \"b8bba20c-b75b-40da-98ad-436a4d121d13\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.491026 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-extra-config-0\") pod \"b8bba20c-b75b-40da-98ad-436a4d121d13\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.491119 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-1\") pod \"b8bba20c-b75b-40da-98ad-436a4d121d13\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.491201 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-1\") pod \"b8bba20c-b75b-40da-98ad-436a4d121d13\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 
13:02:44.491258 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x8hx\" (UniqueName: \"kubernetes.io/projected/b8bba20c-b75b-40da-98ad-436a4d121d13-kube-api-access-2x8hx\") pod \"b8bba20c-b75b-40da-98ad-436a4d121d13\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.491298 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-inventory\") pod \"b8bba20c-b75b-40da-98ad-436a4d121d13\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.491322 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-ssh-key\") pod \"b8bba20c-b75b-40da-98ad-436a4d121d13\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.491362 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-0\") pod \"b8bba20c-b75b-40da-98ad-436a4d121d13\" (UID: \"b8bba20c-b75b-40da-98ad-436a4d121d13\") " Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.497152 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8bba20c-b75b-40da-98ad-436a4d121d13-kube-api-access-2x8hx" (OuterVolumeSpecName: "kube-api-access-2x8hx") pod "b8bba20c-b75b-40da-98ad-436a4d121d13" (UID: "b8bba20c-b75b-40da-98ad-436a4d121d13"). InnerVolumeSpecName "kube-api-access-2x8hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.502029 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "b8bba20c-b75b-40da-98ad-436a4d121d13" (UID: "b8bba20c-b75b-40da-98ad-436a4d121d13"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.524456 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b8bba20c-b75b-40da-98ad-436a4d121d13" (UID: "b8bba20c-b75b-40da-98ad-436a4d121d13"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.524652 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "b8bba20c-b75b-40da-98ad-436a4d121d13" (UID: "b8bba20c-b75b-40da-98ad-436a4d121d13"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.526856 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "b8bba20c-b75b-40da-98ad-436a4d121d13" (UID: "b8bba20c-b75b-40da-98ad-436a4d121d13"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.530603 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "b8bba20c-b75b-40da-98ad-436a4d121d13" (UID: "b8bba20c-b75b-40da-98ad-436a4d121d13"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.531126 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "b8bba20c-b75b-40da-98ad-436a4d121d13" (UID: "b8bba20c-b75b-40da-98ad-436a4d121d13"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.544730 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "b8bba20c-b75b-40da-98ad-436a4d121d13" (UID: "b8bba20c-b75b-40da-98ad-436a4d121d13"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.545290 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-inventory" (OuterVolumeSpecName: "inventory") pod "b8bba20c-b75b-40da-98ad-436a4d121d13" (UID: "b8bba20c-b75b-40da-98ad-436a4d121d13"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.594838 4760 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.594871 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x8hx\" (UniqueName: \"kubernetes.io/projected/b8bba20c-b75b-40da-98ad-436a4d121d13-kube-api-access-2x8hx\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.594882 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.594892 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.594902 4760 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.594911 4760 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.594942 4760 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 
13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.594955 4760 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.594964 4760 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b8bba20c-b75b-40da-98ad-436a4d121d13-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.875604 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" event={"ID":"b8bba20c-b75b-40da-98ad-436a4d121d13","Type":"ContainerDied","Data":"fca8f6e8698e692b1b59f712d14070e4a5d6a70fe87cf0e01929a1564398d2aa"} Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.875641 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fca8f6e8698e692b1b59f712d14070e4a5d6a70fe87cf0e01929a1564398d2aa" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.875685 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fx275" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.991668 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc"] Dec 04 13:02:44 crc kubenswrapper[4760]: E1204 13:02:44.992249 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21050675-a756-485b-9d91-bf9f00c8fd48" containerName="registry-server" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.992273 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="21050675-a756-485b-9d91-bf9f00c8fd48" containerName="registry-server" Dec 04 13:02:44 crc kubenswrapper[4760]: E1204 13:02:44.992296 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21050675-a756-485b-9d91-bf9f00c8fd48" containerName="extract-utilities" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.992307 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="21050675-a756-485b-9d91-bf9f00c8fd48" containerName="extract-utilities" Dec 04 13:02:44 crc kubenswrapper[4760]: E1204 13:02:44.992337 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21050675-a756-485b-9d91-bf9f00c8fd48" containerName="extract-content" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.992346 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="21050675-a756-485b-9d91-bf9f00c8fd48" containerName="extract-content" Dec 04 13:02:44 crc kubenswrapper[4760]: E1204 13:02:44.992361 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bba20c-b75b-40da-98ad-436a4d121d13" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.992370 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bba20c-b75b-40da-98ad-436a4d121d13" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.992646 4760 
memory_manager.go:354] "RemoveStaleState removing state" podUID="21050675-a756-485b-9d91-bf9f00c8fd48" containerName="registry-server" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.992670 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8bba20c-b75b-40da-98ad-436a4d121d13" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.993611 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.996679 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.997944 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wm7t9" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.998134 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 04 13:02:44 crc kubenswrapper[4760]: I1204 13:02:44.998329 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.002318 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.002383 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.002433 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.002465 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.002516 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.002560 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" 
(UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.002621 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnxzd\" (UniqueName: \"kubernetes.io/projected/6b99a8e4-6932-4867-b485-872dfefcf4fc-kube-api-access-dnxzd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.003052 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.003866 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc"] Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.104365 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxzd\" (UniqueName: \"kubernetes.io/projected/6b99a8e4-6932-4867-b485-872dfefcf4fc-kube-api-access-dnxzd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.104497 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.104540 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.104579 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.104603 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.104666 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.104700 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: 
\"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.108574 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.108637 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.108815 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.109199 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.110260 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.111653 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.123990 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnxzd\" (UniqueName: \"kubernetes.io/projected/6b99a8e4-6932-4867-b485-872dfefcf4fc-kube-api-access-dnxzd\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:45 crc kubenswrapper[4760]: I1204 13:02:45.314058 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:02:46 crc kubenswrapper[4760]: I1204 13:02:46.003094 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc"] Dec 04 13:02:46 crc kubenswrapper[4760]: I1204 13:02:46.893511 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" event={"ID":"6b99a8e4-6932-4867-b485-872dfefcf4fc","Type":"ContainerStarted","Data":"47c79775fa567a28c40693acfa906b75ae106a1cc6bf41f0948419e707b9f43c"} Dec 04 13:02:46 crc kubenswrapper[4760]: I1204 13:02:46.893865 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" event={"ID":"6b99a8e4-6932-4867-b485-872dfefcf4fc","Type":"ContainerStarted","Data":"7dd75e3e73bc66360184f999a1a55620809bf71b045bb16269a265bf9b9a0408"} Dec 04 13:02:46 crc kubenswrapper[4760]: I1204 13:02:46.913617 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" podStartSLOduration=2.425563627 podStartE2EDuration="2.913586292s" podCreationTimestamp="2025-12-04 13:02:44 +0000 UTC" firstStartedPulling="2025-12-04 13:02:46.000097262 +0000 UTC m=+2969.041543829" lastFinishedPulling="2025-12-04 13:02:46.488119927 +0000 UTC m=+2969.529566494" observedRunningTime="2025-12-04 13:02:46.908052716 +0000 UTC m=+2969.949499293" watchObservedRunningTime="2025-12-04 13:02:46.913586292 +0000 UTC m=+2969.955032859" Dec 04 13:03:03 crc kubenswrapper[4760]: I1204 13:03:03.380312 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:03:03 crc kubenswrapper[4760]: 
I1204 13:03:03.380959 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.675986 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2bb5l"] Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.686133 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.703734 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2bb5l"] Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.805170 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-slngp"] Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.811455 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.818101 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-slngp"] Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.858554 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhfsh\" (UniqueName: \"kubernetes.io/projected/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-kube-api-access-mhfsh\") pod \"certified-operators-2bb5l\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.859502 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-utilities\") pod \"certified-operators-2bb5l\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.859645 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-catalog-content\") pod \"certified-operators-2bb5l\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.961375 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-catalog-content\") pod \"redhat-marketplace-slngp\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.961537 4760 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-utilities\") pod \"redhat-marketplace-slngp\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.961804 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cld64\" (UniqueName: \"kubernetes.io/projected/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-kube-api-access-cld64\") pod \"redhat-marketplace-slngp\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.961935 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-utilities\") pod \"certified-operators-2bb5l\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.962010 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-catalog-content\") pod \"certified-operators-2bb5l\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.962083 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhfsh\" (UniqueName: \"kubernetes.io/projected/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-kube-api-access-mhfsh\") pod \"certified-operators-2bb5l\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 
13:03:11.962558 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-utilities\") pod \"certified-operators-2bb5l\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.962691 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-catalog-content\") pod \"certified-operators-2bb5l\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:11 crc kubenswrapper[4760]: I1204 13:03:11.996528 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhfsh\" (UniqueName: \"kubernetes.io/projected/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-kube-api-access-mhfsh\") pod \"certified-operators-2bb5l\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:12 crc kubenswrapper[4760]: I1204 13:03:12.013925 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:12 crc kubenswrapper[4760]: I1204 13:03:12.064390 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-utilities\") pod \"redhat-marketplace-slngp\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:12 crc kubenswrapper[4760]: I1204 13:03:12.064959 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cld64\" (UniqueName: \"kubernetes.io/projected/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-kube-api-access-cld64\") pod \"redhat-marketplace-slngp\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:12 crc kubenswrapper[4760]: I1204 13:03:12.065028 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-utilities\") pod \"redhat-marketplace-slngp\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:12 crc kubenswrapper[4760]: I1204 13:03:12.066571 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-catalog-content\") pod \"redhat-marketplace-slngp\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:12 crc kubenswrapper[4760]: I1204 13:03:12.067813 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-catalog-content\") pod \"redhat-marketplace-slngp\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " 
pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:12 crc kubenswrapper[4760]: I1204 13:03:12.084187 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cld64\" (UniqueName: \"kubernetes.io/projected/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-kube-api-access-cld64\") pod \"redhat-marketplace-slngp\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:12 crc kubenswrapper[4760]: I1204 13:03:12.141476 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:12 crc kubenswrapper[4760]: I1204 13:03:12.707713 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2bb5l"] Dec 04 13:03:12 crc kubenswrapper[4760]: I1204 13:03:12.868093 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-slngp"] Dec 04 13:03:12 crc kubenswrapper[4760]: W1204 13:03:12.872484 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19c5ec8f_41b1_45c5_916e_1c4f87cce7f3.slice/crio-b1ea9460f29528b9fefd021828b423972dda97f1dfce8969a94c6e3d0fbbb399 WatchSource:0}: Error finding container b1ea9460f29528b9fefd021828b423972dda97f1dfce8969a94c6e3d0fbbb399: Status 404 returned error can't find the container with id b1ea9460f29528b9fefd021828b423972dda97f1dfce8969a94c6e3d0fbbb399 Dec 04 13:03:13 crc kubenswrapper[4760]: I1204 13:03:13.146946 4760 generic.go:334] "Generic (PLEG): container finished" podID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerID="7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f" exitCode=0 Dec 04 13:03:13 crc kubenswrapper[4760]: I1204 13:03:13.147023 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bb5l" 
event={"ID":"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87","Type":"ContainerDied","Data":"7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f"} Dec 04 13:03:13 crc kubenswrapper[4760]: I1204 13:03:13.147053 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bb5l" event={"ID":"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87","Type":"ContainerStarted","Data":"29b1f928af0211bc79f24e77477efd00cabfff96d0fc4509ac3d4fdb3df79190"} Dec 04 13:03:13 crc kubenswrapper[4760]: I1204 13:03:13.149044 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slngp" event={"ID":"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3","Type":"ContainerStarted","Data":"b1ea9460f29528b9fefd021828b423972dda97f1dfce8969a94c6e3d0fbbb399"} Dec 04 13:03:14 crc kubenswrapper[4760]: I1204 13:03:14.160717 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bb5l" event={"ID":"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87","Type":"ContainerStarted","Data":"e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0"} Dec 04 13:03:14 crc kubenswrapper[4760]: I1204 13:03:14.163108 4760 generic.go:334] "Generic (PLEG): container finished" podID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerID="6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1" exitCode=0 Dec 04 13:03:14 crc kubenswrapper[4760]: I1204 13:03:14.163150 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slngp" event={"ID":"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3","Type":"ContainerDied","Data":"6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1"} Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.002993 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-95t6c"] Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.024854 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.035033 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95t6c"] Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.142883 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-catalog-content\") pod \"redhat-operators-95t6c\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.143612 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l65jf\" (UniqueName: \"kubernetes.io/projected/50c2512d-fddd-45b2-8ddb-1211a8b274db-kube-api-access-l65jf\") pod \"redhat-operators-95t6c\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.143833 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-utilities\") pod \"redhat-operators-95t6c\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.174311 4760 generic.go:334] "Generic (PLEG): container finished" podID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerID="e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0" exitCode=0 Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.174559 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bb5l" 
event={"ID":"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87","Type":"ContainerDied","Data":"e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0"} Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.181867 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slngp" event={"ID":"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3","Type":"ContainerStarted","Data":"3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b"} Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.246635 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-catalog-content\") pod \"redhat-operators-95t6c\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.246707 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l65jf\" (UniqueName: \"kubernetes.io/projected/50c2512d-fddd-45b2-8ddb-1211a8b274db-kube-api-access-l65jf\") pod \"redhat-operators-95t6c\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.246789 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-utilities\") pod \"redhat-operators-95t6c\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.247301 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-catalog-content\") pod \"redhat-operators-95t6c\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " 
pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.247396 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-utilities\") pod \"redhat-operators-95t6c\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.272866 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l65jf\" (UniqueName: \"kubernetes.io/projected/50c2512d-fddd-45b2-8ddb-1211a8b274db-kube-api-access-l65jf\") pod \"redhat-operators-95t6c\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:15 crc kubenswrapper[4760]: I1204 13:03:15.378588 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:16 crc kubenswrapper[4760]: I1204 13:03:16.075017 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95t6c"] Dec 04 13:03:16 crc kubenswrapper[4760]: W1204 13:03:16.077891 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50c2512d_fddd_45b2_8ddb_1211a8b274db.slice/crio-adc216004ff967c329d78d1bce4bcb086f1c33a8b1f2b4882e8d627e2891a9b2 WatchSource:0}: Error finding container adc216004ff967c329d78d1bce4bcb086f1c33a8b1f2b4882e8d627e2891a9b2: Status 404 returned error can't find the container with id adc216004ff967c329d78d1bce4bcb086f1c33a8b1f2b4882e8d627e2891a9b2 Dec 04 13:03:16 crc kubenswrapper[4760]: I1204 13:03:16.196089 4760 generic.go:334] "Generic (PLEG): container finished" podID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerID="3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b" exitCode=0 Dec 04 13:03:16 crc 
kubenswrapper[4760]: I1204 13:03:16.196556 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slngp" event={"ID":"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3","Type":"ContainerDied","Data":"3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b"} Dec 04 13:03:16 crc kubenswrapper[4760]: I1204 13:03:16.198948 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95t6c" event={"ID":"50c2512d-fddd-45b2-8ddb-1211a8b274db","Type":"ContainerStarted","Data":"adc216004ff967c329d78d1bce4bcb086f1c33a8b1f2b4882e8d627e2891a9b2"} Dec 04 13:03:17 crc kubenswrapper[4760]: I1204 13:03:17.209562 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bb5l" event={"ID":"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87","Type":"ContainerStarted","Data":"1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9"} Dec 04 13:03:17 crc kubenswrapper[4760]: I1204 13:03:17.212453 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slngp" event={"ID":"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3","Type":"ContainerStarted","Data":"6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8"} Dec 04 13:03:17 crc kubenswrapper[4760]: I1204 13:03:17.214271 4760 generic.go:334] "Generic (PLEG): container finished" podID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerID="c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000" exitCode=0 Dec 04 13:03:17 crc kubenswrapper[4760]: I1204 13:03:17.214310 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95t6c" event={"ID":"50c2512d-fddd-45b2-8ddb-1211a8b274db","Type":"ContainerDied","Data":"c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000"} Dec 04 13:03:17 crc kubenswrapper[4760]: I1204 13:03:17.244113 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-2bb5l" podStartSLOduration=3.535125014 podStartE2EDuration="6.244093789s" podCreationTimestamp="2025-12-04 13:03:11 +0000 UTC" firstStartedPulling="2025-12-04 13:03:13.149816585 +0000 UTC m=+2996.191263152" lastFinishedPulling="2025-12-04 13:03:15.85878536 +0000 UTC m=+2998.900231927" observedRunningTime="2025-12-04 13:03:17.239473681 +0000 UTC m=+3000.280920258" watchObservedRunningTime="2025-12-04 13:03:17.244093789 +0000 UTC m=+3000.285540356" Dec 04 13:03:17 crc kubenswrapper[4760]: I1204 13:03:17.290994 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-slngp" podStartSLOduration=3.784367535 podStartE2EDuration="6.290974229s" podCreationTimestamp="2025-12-04 13:03:11 +0000 UTC" firstStartedPulling="2025-12-04 13:03:14.165312836 +0000 UTC m=+2997.206759403" lastFinishedPulling="2025-12-04 13:03:16.67191953 +0000 UTC m=+2999.713366097" observedRunningTime="2025-12-04 13:03:17.285299858 +0000 UTC m=+3000.326746445" watchObservedRunningTime="2025-12-04 13:03:17.290974229 +0000 UTC m=+3000.332420796" Dec 04 13:03:18 crc kubenswrapper[4760]: I1204 13:03:18.224150 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95t6c" event={"ID":"50c2512d-fddd-45b2-8ddb-1211a8b274db","Type":"ContainerStarted","Data":"bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873"} Dec 04 13:03:22 crc kubenswrapper[4760]: I1204 13:03:22.014962 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:22 crc kubenswrapper[4760]: I1204 13:03:22.015910 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:22 crc kubenswrapper[4760]: I1204 13:03:22.142111 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:22 crc kubenswrapper[4760]: I1204 13:03:22.143679 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:22 crc kubenswrapper[4760]: I1204 13:03:22.188771 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:22 crc kubenswrapper[4760]: I1204 13:03:22.321701 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:22 crc kubenswrapper[4760]: I1204 13:03:22.604423 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-slngp"] Dec 04 13:03:23 crc kubenswrapper[4760]: I1204 13:03:23.082361 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2bb5l" podUID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerName="registry-server" probeResult="failure" output=< Dec 04 13:03:23 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Dec 04 13:03:23 crc kubenswrapper[4760]: > Dec 04 13:03:23 crc kubenswrapper[4760]: I1204 13:03:23.275933 4760 generic.go:334] "Generic (PLEG): container finished" podID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerID="bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873" exitCode=0 Dec 04 13:03:23 crc kubenswrapper[4760]: I1204 13:03:23.276301 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95t6c" event={"ID":"50c2512d-fddd-45b2-8ddb-1211a8b274db","Type":"ContainerDied","Data":"bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873"} Dec 04 13:03:24 crc kubenswrapper[4760]: I1204 13:03:24.286830 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95t6c" 
event={"ID":"50c2512d-fddd-45b2-8ddb-1211a8b274db","Type":"ContainerStarted","Data":"67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96"} Dec 04 13:03:24 crc kubenswrapper[4760]: I1204 13:03:24.286985 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-slngp" podUID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerName="registry-server" containerID="cri-o://6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8" gracePeriod=2 Dec 04 13:03:24 crc kubenswrapper[4760]: I1204 13:03:24.310646 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-95t6c" podStartSLOduration=3.854380399 podStartE2EDuration="10.310623566s" podCreationTimestamp="2025-12-04 13:03:14 +0000 UTC" firstStartedPulling="2025-12-04 13:03:17.216073178 +0000 UTC m=+3000.257519745" lastFinishedPulling="2025-12-04 13:03:23.672316345 +0000 UTC m=+3006.713762912" observedRunningTime="2025-12-04 13:03:24.305597647 +0000 UTC m=+3007.347044234" watchObservedRunningTime="2025-12-04 13:03:24.310623566 +0000 UTC m=+3007.352070133" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.238041 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.300037 4760 generic.go:334] "Generic (PLEG): container finished" podID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerID="6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8" exitCode=0 Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.300091 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slngp" event={"ID":"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3","Type":"ContainerDied","Data":"6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8"} Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.300114 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-slngp" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.300133 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slngp" event={"ID":"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3","Type":"ContainerDied","Data":"b1ea9460f29528b9fefd021828b423972dda97f1dfce8969a94c6e3d0fbbb399"} Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.300159 4760 scope.go:117] "RemoveContainer" containerID="6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.333708 4760 scope.go:117] "RemoveContainer" containerID="3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.357606 4760 scope.go:117] "RemoveContainer" containerID="6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.379637 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.379693 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.396917 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-utilities\") pod \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.397102 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cld64\" (UniqueName: \"kubernetes.io/projected/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-kube-api-access-cld64\") pod \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.397413 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-catalog-content\") pod \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\" (UID: \"19c5ec8f-41b1-45c5-916e-1c4f87cce7f3\") " Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.397587 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-utilities" (OuterVolumeSpecName: "utilities") pod "19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" (UID: "19c5ec8f-41b1-45c5-916e-1c4f87cce7f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.398460 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.404054 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-kube-api-access-cld64" (OuterVolumeSpecName: "kube-api-access-cld64") pod "19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" (UID: "19c5ec8f-41b1-45c5-916e-1c4f87cce7f3"). InnerVolumeSpecName "kube-api-access-cld64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.405655 4760 scope.go:117] "RemoveContainer" containerID="6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8" Dec 04 13:03:25 crc kubenswrapper[4760]: E1204 13:03:25.408715 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8\": container with ID starting with 6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8 not found: ID does not exist" containerID="6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.408777 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8"} err="failed to get container status \"6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8\": rpc error: code = NotFound desc = could not find container \"6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8\": container with ID starting with 6b767233925b4205419aa877dce4760a2737df58e3763a02acb4d1fb222c7ac8 not found: ID 
does not exist" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.408815 4760 scope.go:117] "RemoveContainer" containerID="3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b" Dec 04 13:03:25 crc kubenswrapper[4760]: E1204 13:03:25.411604 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b\": container with ID starting with 3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b not found: ID does not exist" containerID="3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.411639 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b"} err="failed to get container status \"3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b\": rpc error: code = NotFound desc = could not find container \"3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b\": container with ID starting with 3b32f9aea4f6fc4d397f8e4764610cd921af970b10680bb99d221ee0f11de32b not found: ID does not exist" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.411659 4760 scope.go:117] "RemoveContainer" containerID="6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1" Dec 04 13:03:25 crc kubenswrapper[4760]: E1204 13:03:25.412340 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1\": container with ID starting with 6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1 not found: ID does not exist" containerID="6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.412394 4760 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1"} err="failed to get container status \"6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1\": rpc error: code = NotFound desc = could not find container \"6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1\": container with ID starting with 6b3aec24b9b39890c73c4f28276b67e53f78ac60f84029740d34b8440be1eae1 not found: ID does not exist" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.415255 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" (UID: "19c5ec8f-41b1-45c5-916e-1c4f87cce7f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.499910 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cld64\" (UniqueName: \"kubernetes.io/projected/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-kube-api-access-cld64\") on node \"crc\" DevicePath \"\"" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.499948 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.633746 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-slngp"] Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.643024 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-slngp"] Dec 04 13:03:25 crc kubenswrapper[4760]: I1204 13:03:25.875541 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" path="/var/lib/kubelet/pods/19c5ec8f-41b1-45c5-916e-1c4f87cce7f3/volumes" Dec 04 13:03:26 crc kubenswrapper[4760]: I1204 13:03:26.434975 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-95t6c" podUID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerName="registry-server" probeResult="failure" output=< Dec 04 13:03:26 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Dec 04 13:03:26 crc kubenswrapper[4760]: > Dec 04 13:03:32 crc kubenswrapper[4760]: I1204 13:03:32.066549 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:32 crc kubenswrapper[4760]: I1204 13:03:32.115627 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:32 crc kubenswrapper[4760]: I1204 13:03:32.328719 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2bb5l"] Dec 04 13:03:33 crc kubenswrapper[4760]: I1204 13:03:33.371430 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2bb5l" podUID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerName="registry-server" containerID="cri-o://1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9" gracePeriod=2 Dec 04 13:03:33 crc kubenswrapper[4760]: I1204 13:03:33.380857 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:03:33 crc kubenswrapper[4760]: I1204 13:03:33.380923 4760 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:03:33 crc kubenswrapper[4760]: I1204 13:03:33.380974 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 13:03:33 crc kubenswrapper[4760]: I1204 13:03:33.382476 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 13:03:33 crc kubenswrapper[4760]: I1204 13:03:33.382557 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" gracePeriod=600 Dec 04 13:03:33 crc kubenswrapper[4760]: E1204 13:03:33.544103 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:03:33 crc kubenswrapper[4760]: I1204 13:03:33.901994 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.099008 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhfsh\" (UniqueName: \"kubernetes.io/projected/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-kube-api-access-mhfsh\") pod \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.099550 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-catalog-content\") pod \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.099824 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-utilities\") pod \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\" (UID: \"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87\") " Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.100299 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-utilities" (OuterVolumeSpecName: "utilities") pod "5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" (UID: "5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.100976 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.108690 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-kube-api-access-mhfsh" (OuterVolumeSpecName: "kube-api-access-mhfsh") pod "5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" (UID: "5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87"). InnerVolumeSpecName "kube-api-access-mhfsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.156759 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" (UID: "5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.202913 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhfsh\" (UniqueName: \"kubernetes.io/projected/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-kube-api-access-mhfsh\") on node \"crc\" DevicePath \"\"" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.202945 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.383983 4760 generic.go:334] "Generic (PLEG): container finished" podID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerID="1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9" exitCode=0 Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.384058 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bb5l" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.384080 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bb5l" event={"ID":"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87","Type":"ContainerDied","Data":"1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9"} Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.387404 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bb5l" event={"ID":"5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87","Type":"ContainerDied","Data":"29b1f928af0211bc79f24e77477efd00cabfff96d0fc4509ac3d4fdb3df79190"} Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.387447 4760 scope.go:117] "RemoveContainer" containerID="1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.390151 4760 generic.go:334] "Generic (PLEG): container 
finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" exitCode=0 Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.390231 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184"} Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.391082 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:03:34 crc kubenswrapper[4760]: E1204 13:03:34.391538 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.430358 4760 scope.go:117] "RemoveContainer" containerID="e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.451411 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2bb5l"] Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.463184 4760 scope.go:117] "RemoveContainer" containerID="7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.466458 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2bb5l"] Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.487698 4760 scope.go:117] "RemoveContainer" 
containerID="1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9" Dec 04 13:03:34 crc kubenswrapper[4760]: E1204 13:03:34.488180 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9\": container with ID starting with 1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9 not found: ID does not exist" containerID="1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.488322 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9"} err="failed to get container status \"1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9\": rpc error: code = NotFound desc = could not find container \"1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9\": container with ID starting with 1ff5b73d88af65f93e4824d348ac2e28eaadeab0c88d3369337ce9857ce34eb9 not found: ID does not exist" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.488482 4760 scope.go:117] "RemoveContainer" containerID="e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0" Dec 04 13:03:34 crc kubenswrapper[4760]: E1204 13:03:34.488977 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0\": container with ID starting with e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0 not found: ID does not exist" containerID="e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.489010 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0"} err="failed to get container status \"e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0\": rpc error: code = NotFound desc = could not find container \"e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0\": container with ID starting with e183ba1f78c6eac986f35cc40a8b0afb5d7fcd5643195f92403da3e6ce95e8c0 not found: ID does not exist" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.489033 4760 scope.go:117] "RemoveContainer" containerID="7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f" Dec 04 13:03:34 crc kubenswrapper[4760]: E1204 13:03:34.489277 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f\": container with ID starting with 7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f not found: ID does not exist" containerID="7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.489374 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f"} err="failed to get container status \"7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f\": rpc error: code = NotFound desc = could not find container \"7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f\": container with ID starting with 7faf50124edd5f188ed79ebb0d0a5d3fc33fcc6a06b5212cafcae6791e7d105f not found: ID does not exist" Dec 04 13:03:34 crc kubenswrapper[4760]: I1204 13:03:34.489471 4760 scope.go:117] "RemoveContainer" containerID="233692fc76a20916f8002a1dc862a924e8d80bcf11a573807153e1e74b91d84e" Dec 04 13:03:35 crc kubenswrapper[4760]: I1204 13:03:35.455354 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:35 crc kubenswrapper[4760]: I1204 13:03:35.531602 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:35 crc kubenswrapper[4760]: I1204 13:03:35.878787 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" path="/var/lib/kubelet/pods/5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87/volumes" Dec 04 13:03:36 crc kubenswrapper[4760]: I1204 13:03:36.707109 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95t6c"] Dec 04 13:03:37 crc kubenswrapper[4760]: I1204 13:03:37.427409 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-95t6c" podUID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerName="registry-server" containerID="cri-o://67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96" gracePeriod=2 Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.032386 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.123824 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-catalog-content\") pod \"50c2512d-fddd-45b2-8ddb-1211a8b274db\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.124005 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l65jf\" (UniqueName: \"kubernetes.io/projected/50c2512d-fddd-45b2-8ddb-1211a8b274db-kube-api-access-l65jf\") pod \"50c2512d-fddd-45b2-8ddb-1211a8b274db\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.124139 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-utilities\") pod \"50c2512d-fddd-45b2-8ddb-1211a8b274db\" (UID: \"50c2512d-fddd-45b2-8ddb-1211a8b274db\") " Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.124937 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-utilities" (OuterVolumeSpecName: "utilities") pod "50c2512d-fddd-45b2-8ddb-1211a8b274db" (UID: "50c2512d-fddd-45b2-8ddb-1211a8b274db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.130568 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c2512d-fddd-45b2-8ddb-1211a8b274db-kube-api-access-l65jf" (OuterVolumeSpecName: "kube-api-access-l65jf") pod "50c2512d-fddd-45b2-8ddb-1211a8b274db" (UID: "50c2512d-fddd-45b2-8ddb-1211a8b274db"). InnerVolumeSpecName "kube-api-access-l65jf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.226853 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l65jf\" (UniqueName: \"kubernetes.io/projected/50c2512d-fddd-45b2-8ddb-1211a8b274db-kube-api-access-l65jf\") on node \"crc\" DevicePath \"\"" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.226896 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.236534 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50c2512d-fddd-45b2-8ddb-1211a8b274db" (UID: "50c2512d-fddd-45b2-8ddb-1211a8b274db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.329059 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c2512d-fddd-45b2-8ddb-1211a8b274db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.438676 4760 generic.go:334] "Generic (PLEG): container finished" podID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerID="67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96" exitCode=0 Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.438728 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95t6c" event={"ID":"50c2512d-fddd-45b2-8ddb-1211a8b274db","Type":"ContainerDied","Data":"67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96"} Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.438769 4760 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-95t6c" event={"ID":"50c2512d-fddd-45b2-8ddb-1211a8b274db","Type":"ContainerDied","Data":"adc216004ff967c329d78d1bce4bcb086f1c33a8b1f2b4882e8d627e2891a9b2"} Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.438792 4760 scope.go:117] "RemoveContainer" containerID="67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.439158 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95t6c" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.475684 4760 scope.go:117] "RemoveContainer" containerID="bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.485706 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95t6c"] Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.496132 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-95t6c"] Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.497788 4760 scope.go:117] "RemoveContainer" containerID="c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.550159 4760 scope.go:117] "RemoveContainer" containerID="67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96" Dec 04 13:03:38 crc kubenswrapper[4760]: E1204 13:03:38.554667 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96\": container with ID starting with 67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96 not found: ID does not exist" containerID="67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.554727 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96"} err="failed to get container status \"67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96\": rpc error: code = NotFound desc = could not find container \"67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96\": container with ID starting with 67caddf7d098a0b82f420d87f1ff5529c2b8dc8a32f67f1917e1afa982b5ec96 not found: ID does not exist" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.554769 4760 scope.go:117] "RemoveContainer" containerID="bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873" Dec 04 13:03:38 crc kubenswrapper[4760]: E1204 13:03:38.555276 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873\": container with ID starting with bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873 not found: ID does not exist" containerID="bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.555430 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873"} err="failed to get container status \"bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873\": rpc error: code = NotFound desc = could not find container \"bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873\": container with ID starting with bba83fbb76d2e4c4657966054077df1d9b842fef15bf1bf4948161ec37f68873 not found: ID does not exist" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.555546 4760 scope.go:117] "RemoveContainer" containerID="c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000" Dec 04 13:03:38 crc kubenswrapper[4760]: E1204 
13:03:38.556089 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000\": container with ID starting with c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000 not found: ID does not exist" containerID="c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000" Dec 04 13:03:38 crc kubenswrapper[4760]: I1204 13:03:38.556202 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000"} err="failed to get container status \"c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000\": rpc error: code = NotFound desc = could not find container \"c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000\": container with ID starting with c615181c718f833029d40dd4fd92ebfd28e7ada7338bfe9cfd6a81719f050000 not found: ID does not exist" Dec 04 13:03:39 crc kubenswrapper[4760]: I1204 13:03:39.877158 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50c2512d-fddd-45b2-8ddb-1211a8b274db" path="/var/lib/kubelet/pods/50c2512d-fddd-45b2-8ddb-1211a8b274db/volumes" Dec 04 13:03:44 crc kubenswrapper[4760]: I1204 13:03:44.863849 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:03:44 crc kubenswrapper[4760]: E1204 13:03:44.864626 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:03:56 crc kubenswrapper[4760]: I1204 13:03:56.864266 
4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:03:56 crc kubenswrapper[4760]: E1204 13:03:56.865181 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:04:10 crc kubenswrapper[4760]: I1204 13:04:10.864435 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:04:10 crc kubenswrapper[4760]: E1204 13:04:10.865108 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:04:25 crc kubenswrapper[4760]: I1204 13:04:25.864683 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:04:25 crc kubenswrapper[4760]: E1204 13:04:25.865387 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:04:36 crc kubenswrapper[4760]: I1204 
13:04:36.864948 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:04:36 crc kubenswrapper[4760]: E1204 13:04:36.865806 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:04:50 crc kubenswrapper[4760]: I1204 13:04:50.865043 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:04:50 crc kubenswrapper[4760]: E1204 13:04:50.867780 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:05:04 crc kubenswrapper[4760]: I1204 13:05:04.864508 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:05:04 crc kubenswrapper[4760]: E1204 13:05:04.866245 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:05:07 crc 
kubenswrapper[4760]: I1204 13:05:07.372346 4760 generic.go:334] "Generic (PLEG): container finished" podID="6b99a8e4-6932-4867-b485-872dfefcf4fc" containerID="47c79775fa567a28c40693acfa906b75ae106a1cc6bf41f0948419e707b9f43c" exitCode=0 Dec 04 13:05:07 crc kubenswrapper[4760]: I1204 13:05:07.372413 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" event={"ID":"6b99a8e4-6932-4867-b485-872dfefcf4fc","Type":"ContainerDied","Data":"47c79775fa567a28c40693acfa906b75ae106a1cc6bf41f0948419e707b9f43c"} Dec 04 13:05:08 crc kubenswrapper[4760]: I1204 13:05:08.882418 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.011432 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-telemetry-combined-ca-bundle\") pod \"6b99a8e4-6932-4867-b485-872dfefcf4fc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.011603 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-0\") pod \"6b99a8e4-6932-4867-b485-872dfefcf4fc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.011817 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-1\") pod \"6b99a8e4-6932-4867-b485-872dfefcf4fc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.011880 
4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-2\") pod \"6b99a8e4-6932-4867-b485-872dfefcf4fc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.011905 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnxzd\" (UniqueName: \"kubernetes.io/projected/6b99a8e4-6932-4867-b485-872dfefcf4fc-kube-api-access-dnxzd\") pod \"6b99a8e4-6932-4867-b485-872dfefcf4fc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.011943 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-inventory\") pod \"6b99a8e4-6932-4867-b485-872dfefcf4fc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.011974 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ssh-key\") pod \"6b99a8e4-6932-4867-b485-872dfefcf4fc\" (UID: \"6b99a8e4-6932-4867-b485-872dfefcf4fc\") " Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.018346 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b99a8e4-6932-4867-b485-872dfefcf4fc-kube-api-access-dnxzd" (OuterVolumeSpecName: "kube-api-access-dnxzd") pod "6b99a8e4-6932-4867-b485-872dfefcf4fc" (UID: "6b99a8e4-6932-4867-b485-872dfefcf4fc"). InnerVolumeSpecName "kube-api-access-dnxzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.019261 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6b99a8e4-6932-4867-b485-872dfefcf4fc" (UID: "6b99a8e4-6932-4867-b485-872dfefcf4fc"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.041768 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "6b99a8e4-6932-4867-b485-872dfefcf4fc" (UID: "6b99a8e4-6932-4867-b485-872dfefcf4fc"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.042199 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "6b99a8e4-6932-4867-b485-872dfefcf4fc" (UID: "6b99a8e4-6932-4867-b485-872dfefcf4fc"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.045518 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b99a8e4-6932-4867-b485-872dfefcf4fc" (UID: "6b99a8e4-6932-4867-b485-872dfefcf4fc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.047662 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "6b99a8e4-6932-4867-b485-872dfefcf4fc" (UID: "6b99a8e4-6932-4867-b485-872dfefcf4fc"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.051989 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-inventory" (OuterVolumeSpecName: "inventory") pod "6b99a8e4-6932-4867-b485-872dfefcf4fc" (UID: "6b99a8e4-6932-4867-b485-872dfefcf4fc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.114152 4760 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.114188 4760 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.114199 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnxzd\" (UniqueName: \"kubernetes.io/projected/6b99a8e4-6932-4867-b485-872dfefcf4fc-kube-api-access-dnxzd\") on node \"crc\" DevicePath \"\"" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.114230 4760 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.114250 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.114259 4760 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.114268 4760 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/6b99a8e4-6932-4867-b485-872dfefcf4fc-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.395733 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" event={"ID":"6b99a8e4-6932-4867-b485-872dfefcf4fc","Type":"ContainerDied","Data":"7dd75e3e73bc66360184f999a1a55620809bf71b045bb16269a265bf9b9a0408"} Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.396061 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dd75e3e73bc66360184f999a1a55620809bf71b045bb16269a265bf9b9a0408" Dec 04 13:05:09 crc kubenswrapper[4760]: I1204 13:05:09.395775 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc" Dec 04 13:05:17 crc kubenswrapper[4760]: I1204 13:05:17.873895 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:05:17 crc kubenswrapper[4760]: E1204 13:05:17.874892 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:05:29 crc kubenswrapper[4760]: I1204 13:05:29.864689 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:05:29 crc kubenswrapper[4760]: E1204 13:05:29.866008 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:05:41 crc kubenswrapper[4760]: I1204 13:05:41.864571 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:05:41 crc kubenswrapper[4760]: E1204 13:05:41.866526 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:05:52 crc kubenswrapper[4760]: I1204 13:05:52.864781 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:05:52 crc kubenswrapper[4760]: E1204 13:05:52.865590 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:06:03 crc kubenswrapper[4760]: I1204 13:06:03.865402 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:06:03 crc kubenswrapper[4760]: E1204 13:06:03.866138 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:06:14 crc kubenswrapper[4760]: I1204 13:06:14.864782 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:06:14 crc kubenswrapper[4760]: E1204 13:06:14.865601 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.751755 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 13:06:18 crc kubenswrapper[4760]: E1204 13:06:18.752728 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerName="extract-utilities" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.752743 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerName="extract-utilities" Dec 04 13:06:18 crc kubenswrapper[4760]: E1204 13:06:18.752756 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerName="registry-server" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.752761 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerName="registry-server" Dec 04 13:06:18 crc kubenswrapper[4760]: E1204 13:06:18.752778 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerName="registry-server" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.752785 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerName="registry-server" Dec 04 13:06:18 crc kubenswrapper[4760]: E1204 13:06:18.752803 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b99a8e4-6932-4867-b485-872dfefcf4fc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.752812 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b99a8e4-6932-4867-b485-872dfefcf4fc" 
containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 13:06:18 crc kubenswrapper[4760]: E1204 13:06:18.752828 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerName="extract-content" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.752836 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerName="extract-content" Dec 04 13:06:18 crc kubenswrapper[4760]: E1204 13:06:18.752847 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerName="registry-server" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.752853 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerName="registry-server" Dec 04 13:06:18 crc kubenswrapper[4760]: E1204 13:06:18.752873 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerName="extract-content" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.752880 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerName="extract-content" Dec 04 13:06:18 crc kubenswrapper[4760]: E1204 13:06:18.752901 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerName="extract-utilities" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.752908 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerName="extract-utilities" Dec 04 13:06:18 crc kubenswrapper[4760]: E1204 13:06:18.752922 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerName="extract-content" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.752929 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerName="extract-content" Dec 04 13:06:18 crc kubenswrapper[4760]: E1204 13:06:18.752947 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerName="extract-utilities" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.752956 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerName="extract-utilities" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.753190 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="19c5ec8f-41b1-45c5-916e-1c4f87cce7f3" containerName="registry-server" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.753225 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b99a8e4-6932-4867-b485-872dfefcf4fc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.753247 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="50c2512d-fddd-45b2-8ddb-1211a8b274db" containerName="registry-server" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.753255 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fec2b8a-472b-4fb7-9e4d-9e8d8a2ecb87" containerName="registry-server" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.753988 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.763663 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.763844 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.763906 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.764174 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.938480 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.938601 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.938637 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xqj\" (UniqueName: \"kubernetes.io/projected/ef91667b-5e29-49a0-9de9-d557462e96c0-kube-api-access-r5xqj\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " 
pod="openstack/tempest-tests-tempest" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.938672 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.938696 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.938834 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.938995 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-config-data\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 13:06:18.939695 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:18 crc kubenswrapper[4760]: I1204 
13:06:18.939910 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.042149 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.042284 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.042365 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.042422 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.042442 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xqj\" (UniqueName: \"kubernetes.io/projected/ef91667b-5e29-49a0-9de9-d557462e96c0-kube-api-access-r5xqj\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.042483 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.042505 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.042551 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.042587 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-config-data\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.043691 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.048079 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.048248 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.048909 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.049175 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-config-data\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.054346 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.054452 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.065697 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.068989 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xqj\" (UniqueName: \"kubernetes.io/projected/ef91667b-5e29-49a0-9de9-d557462e96c0-kube-api-access-r5xqj\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.079277 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.374122 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.815602 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 13:06:19 crc kubenswrapper[4760]: I1204 13:06:19.825112 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 13:06:20 crc kubenswrapper[4760]: I1204 13:06:20.068825 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ef91667b-5e29-49a0-9de9-d557462e96c0","Type":"ContainerStarted","Data":"516cafa196dc40be073ba31a51ada137cfbda3b4ffc5cff8f652150ad50777f1"} Dec 04 13:06:29 crc kubenswrapper[4760]: I1204 13:06:29.864473 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:06:29 crc kubenswrapper[4760]: E1204 13:06:29.865740 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:06:43 crc kubenswrapper[4760]: I1204 13:06:43.864902 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:06:43 crc kubenswrapper[4760]: E1204 13:06:43.866331 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" 
podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:06:52 crc kubenswrapper[4760]: E1204 13:06:52.404125 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 04 13:06:52 crc kubenswrapper[4760]: E1204 13:06:52.405127 4760 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem
,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5xqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(ef91667b-5e29-49a0-9de9-d557462e96c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 13:06:52 crc kubenswrapper[4760]: E1204 13:06:52.406563 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="ef91667b-5e29-49a0-9de9-d557462e96c0" Dec 04 13:06:53 crc kubenswrapper[4760]: E1204 13:06:53.090120 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="ef91667b-5e29-49a0-9de9-d557462e96c0" Dec 04 13:06:55 crc kubenswrapper[4760]: I1204 13:06:55.865176 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:06:55 crc kubenswrapper[4760]: E1204 13:06:55.866328 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:07:08 crc kubenswrapper[4760]: I1204 13:07:08.864464 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:07:08 crc kubenswrapper[4760]: E1204 13:07:08.865332 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:07:09 crc kubenswrapper[4760]: I1204 13:07:09.264862 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"ef91667b-5e29-49a0-9de9-d557462e96c0","Type":"ContainerStarted","Data":"5e7754fcd7bf18a0518f187a1d01439425bfede59fe05a113042b1164f10ca12"} Dec 04 13:07:23 crc kubenswrapper[4760]: I1204 13:07:23.864821 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:07:23 crc kubenswrapper[4760]: E1204 13:07:23.866122 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:07:34 crc kubenswrapper[4760]: I1204 13:07:34.864851 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:07:34 crc kubenswrapper[4760]: E1204 13:07:34.865591 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:07:45 crc kubenswrapper[4760]: I1204 13:07:45.865021 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:07:45 crc kubenswrapper[4760]: E1204 13:07:45.866101 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:07:56 crc kubenswrapper[4760]: I1204 13:07:56.864662 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:07:56 crc kubenswrapper[4760]: E1204 13:07:56.866063 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:08:07 crc kubenswrapper[4760]: I1204 13:08:07.871026 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:08:07 crc kubenswrapper[4760]: E1204 13:08:07.871867 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:08:18 crc kubenswrapper[4760]: I1204 13:08:18.864348 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:08:18 crc kubenswrapper[4760]: E1204 13:08:18.865171 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:08:33 crc kubenswrapper[4760]: I1204 13:08:33.866451 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:08:35 crc kubenswrapper[4760]: I1204 13:08:35.347036 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"1129d67412d0397ffe3f601030fdfb33f4df842956c78105cfe22db28621df68"} Dec 04 13:08:35 crc kubenswrapper[4760]: I1204 13:08:35.368892 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=90.910989116 podStartE2EDuration="2m18.368837423s" podCreationTimestamp="2025-12-04 13:06:17 +0000 UTC" firstStartedPulling="2025-12-04 13:06:19.824709999 +0000 UTC m=+3182.866156576" lastFinishedPulling="2025-12-04 13:07:07.282558316 +0000 UTC m=+3230.324004883" observedRunningTime="2025-12-04 13:07:09.287028626 +0000 UTC m=+3232.328475193" watchObservedRunningTime="2025-12-04 13:08:35.368837423 +0000 UTC m=+3318.410284000" Dec 04 13:11:03 crc kubenswrapper[4760]: I1204 13:11:03.380625 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:11:03 crc kubenswrapper[4760]: I1204 13:11:03.381139 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:11:33 crc kubenswrapper[4760]: I1204 13:11:33.379982 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:11:33 crc kubenswrapper[4760]: I1204 13:11:33.381873 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:12:03 crc kubenswrapper[4760]: I1204 13:12:03.380590 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:12:03 crc kubenswrapper[4760]: I1204 13:12:03.381272 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:12:03 crc kubenswrapper[4760]: I1204 13:12:03.381314 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 13:12:03 crc kubenswrapper[4760]: I1204 13:12:03.382124 4760 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1129d67412d0397ffe3f601030fdfb33f4df842956c78105cfe22db28621df68"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 13:12:03 crc kubenswrapper[4760]: I1204 13:12:03.382168 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://1129d67412d0397ffe3f601030fdfb33f4df842956c78105cfe22db28621df68" gracePeriod=600 Dec 04 13:12:04 crc kubenswrapper[4760]: I1204 13:12:04.460202 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="1129d67412d0397ffe3f601030fdfb33f4df842956c78105cfe22db28621df68" exitCode=0 Dec 04 13:12:04 crc kubenswrapper[4760]: I1204 13:12:04.460384 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"1129d67412d0397ffe3f601030fdfb33f4df842956c78105cfe22db28621df68"} Dec 04 13:12:04 crc kubenswrapper[4760]: I1204 13:12:04.461078 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a"} Dec 04 13:12:04 crc kubenswrapper[4760]: I1204 13:12:04.461107 4760 scope.go:117] "RemoveContainer" containerID="a42a4eaf678b7a207213d93ef419ca6106afbc34e44b6c814bac7948df921184" Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 13:12:30.446982 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2h9x8"] Dec 04 13:12:30 crc 
kubenswrapper[4760]: I1204 13:12:30.450207 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 13:12:30.466412 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-utilities\") pod \"community-operators-2h9x8\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 13:12:30.466517 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-catalog-content\") pod \"community-operators-2h9x8\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 13:12:30.466552 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxxjc\" (UniqueName: \"kubernetes.io/projected/5a4b6bb9-132e-4c88-a074-1475306b05b9-kube-api-access-cxxjc\") pod \"community-operators-2h9x8\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 13:12:30.467346 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2h9x8"] Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 13:12:30.568730 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-utilities\") pod \"community-operators-2h9x8\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:30 crc 
kubenswrapper[4760]: I1204 13:12:30.568820 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-catalog-content\") pod \"community-operators-2h9x8\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 13:12:30.568845 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxxjc\" (UniqueName: \"kubernetes.io/projected/5a4b6bb9-132e-4c88-a074-1475306b05b9-kube-api-access-cxxjc\") pod \"community-operators-2h9x8\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 13:12:30.569769 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-catalog-content\") pod \"community-operators-2h9x8\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 13:12:30.569769 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-utilities\") pod \"community-operators-2h9x8\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 13:12:30.604362 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxxjc\" (UniqueName: \"kubernetes.io/projected/5a4b6bb9-132e-4c88-a074-1475306b05b9-kube-api-access-cxxjc\") pod \"community-operators-2h9x8\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:30 crc kubenswrapper[4760]: I1204 
13:12:30.788219 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:31 crc kubenswrapper[4760]: I1204 13:12:31.769144 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2h9x8"] Dec 04 13:12:32 crc kubenswrapper[4760]: I1204 13:12:32.712099 4760 generic.go:334] "Generic (PLEG): container finished" podID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerID="c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538" exitCode=0 Dec 04 13:12:32 crc kubenswrapper[4760]: I1204 13:12:32.712286 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2h9x8" event={"ID":"5a4b6bb9-132e-4c88-a074-1475306b05b9","Type":"ContainerDied","Data":"c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538"} Dec 04 13:12:32 crc kubenswrapper[4760]: I1204 13:12:32.712692 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2h9x8" event={"ID":"5a4b6bb9-132e-4c88-a074-1475306b05b9","Type":"ContainerStarted","Data":"bf92a9fa13069e5714b67e4eb378c77775e9eb457c6262be2a999066af230333"} Dec 04 13:12:32 crc kubenswrapper[4760]: I1204 13:12:32.715229 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 13:12:33 crc kubenswrapper[4760]: I1204 13:12:33.725332 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2h9x8" event={"ID":"5a4b6bb9-132e-4c88-a074-1475306b05b9","Type":"ContainerStarted","Data":"bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f"} Dec 04 13:12:34 crc kubenswrapper[4760]: I1204 13:12:34.736522 4760 generic.go:334] "Generic (PLEG): container finished" podID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerID="bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f" exitCode=0 Dec 04 13:12:34 crc 
kubenswrapper[4760]: I1204 13:12:34.736599 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2h9x8" event={"ID":"5a4b6bb9-132e-4c88-a074-1475306b05b9","Type":"ContainerDied","Data":"bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f"} Dec 04 13:12:35 crc kubenswrapper[4760]: I1204 13:12:35.748905 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2h9x8" event={"ID":"5a4b6bb9-132e-4c88-a074-1475306b05b9","Type":"ContainerStarted","Data":"2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135"} Dec 04 13:12:40 crc kubenswrapper[4760]: I1204 13:12:40.909762 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:40 crc kubenswrapper[4760]: I1204 13:12:40.910315 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:40 crc kubenswrapper[4760]: I1204 13:12:40.988931 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:41 crc kubenswrapper[4760]: I1204 13:12:41.018656 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2h9x8" podStartSLOduration=8.61159191 podStartE2EDuration="11.018634078s" podCreationTimestamp="2025-12-04 13:12:30 +0000 UTC" firstStartedPulling="2025-12-04 13:12:32.71491893 +0000 UTC m=+3555.756365487" lastFinishedPulling="2025-12-04 13:12:35.121961098 +0000 UTC m=+3558.163407655" observedRunningTime="2025-12-04 13:12:35.768403607 +0000 UTC m=+3558.809850184" watchObservedRunningTime="2025-12-04 13:12:41.018634078 +0000 UTC m=+3564.060080655" Dec 04 13:12:41 crc kubenswrapper[4760]: I1204 13:12:41.983165 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:42 crc kubenswrapper[4760]: I1204 13:12:42.042570 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2h9x8"] Dec 04 13:12:43 crc kubenswrapper[4760]: I1204 13:12:43.952991 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2h9x8" podUID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerName="registry-server" containerID="cri-o://2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135" gracePeriod=2 Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.547581 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.690277 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-catalog-content\") pod \"5a4b6bb9-132e-4c88-a074-1475306b05b9\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.690500 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-utilities\") pod \"5a4b6bb9-132e-4c88-a074-1475306b05b9\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.690560 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxxjc\" (UniqueName: \"kubernetes.io/projected/5a4b6bb9-132e-4c88-a074-1475306b05b9-kube-api-access-cxxjc\") pod \"5a4b6bb9-132e-4c88-a074-1475306b05b9\" (UID: \"5a4b6bb9-132e-4c88-a074-1475306b05b9\") " Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.691437 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-utilities" (OuterVolumeSpecName: "utilities") pod "5a4b6bb9-132e-4c88-a074-1475306b05b9" (UID: "5a4b6bb9-132e-4c88-a074-1475306b05b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.697123 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a4b6bb9-132e-4c88-a074-1475306b05b9-kube-api-access-cxxjc" (OuterVolumeSpecName: "kube-api-access-cxxjc") pod "5a4b6bb9-132e-4c88-a074-1475306b05b9" (UID: "5a4b6bb9-132e-4c88-a074-1475306b05b9"). InnerVolumeSpecName "kube-api-access-cxxjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.745538 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a4b6bb9-132e-4c88-a074-1475306b05b9" (UID: "5a4b6bb9-132e-4c88-a074-1475306b05b9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.793523 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.793561 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a4b6bb9-132e-4c88-a074-1475306b05b9-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.793572 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxxjc\" (UniqueName: \"kubernetes.io/projected/5a4b6bb9-132e-4c88-a074-1475306b05b9-kube-api-access-cxxjc\") on node \"crc\" DevicePath \"\"" Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.964678 4760 generic.go:334] "Generic (PLEG): container finished" podID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerID="2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135" exitCode=0 Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.964876 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2h9x8" event={"ID":"5a4b6bb9-132e-4c88-a074-1475306b05b9","Type":"ContainerDied","Data":"2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135"} Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.965082 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2h9x8" event={"ID":"5a4b6bb9-132e-4c88-a074-1475306b05b9","Type":"ContainerDied","Data":"bf92a9fa13069e5714b67e4eb378c77775e9eb457c6262be2a999066af230333"} Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.964993 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2h9x8" Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.965113 4760 scope.go:117] "RemoveContainer" containerID="2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135" Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.991492 4760 scope.go:117] "RemoveContainer" containerID="bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f" Dec 04 13:12:44 crc kubenswrapper[4760]: I1204 13:12:44.999036 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2h9x8"] Dec 04 13:12:45 crc kubenswrapper[4760]: I1204 13:12:45.010426 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2h9x8"] Dec 04 13:12:45 crc kubenswrapper[4760]: I1204 13:12:45.028948 4760 scope.go:117] "RemoveContainer" containerID="c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538" Dec 04 13:12:45 crc kubenswrapper[4760]: I1204 13:12:45.068897 4760 scope.go:117] "RemoveContainer" containerID="2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135" Dec 04 13:12:45 crc kubenswrapper[4760]: E1204 13:12:45.069650 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135\": container with ID starting with 2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135 not found: ID does not exist" containerID="2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135" Dec 04 13:12:45 crc kubenswrapper[4760]: I1204 13:12:45.069693 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135"} err="failed to get container status \"2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135\": rpc error: code = NotFound desc = could not find 
container \"2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135\": container with ID starting with 2f0e6f703330e2c1984fd53f1c05b12beab0f25588a0c51ee1019892f2738135 not found: ID does not exist" Dec 04 13:12:45 crc kubenswrapper[4760]: I1204 13:12:45.069721 4760 scope.go:117] "RemoveContainer" containerID="bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f" Dec 04 13:12:45 crc kubenswrapper[4760]: E1204 13:12:45.070090 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f\": container with ID starting with bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f not found: ID does not exist" containerID="bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f" Dec 04 13:12:45 crc kubenswrapper[4760]: I1204 13:12:45.070116 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f"} err="failed to get container status \"bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f\": rpc error: code = NotFound desc = could not find container \"bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f\": container with ID starting with bc663445a27167b2c744767beab34aecf4f3e9aeeb5a8761d60a001f28745d0f not found: ID does not exist" Dec 04 13:12:45 crc kubenswrapper[4760]: I1204 13:12:45.070133 4760 scope.go:117] "RemoveContainer" containerID="c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538" Dec 04 13:12:45 crc kubenswrapper[4760]: E1204 13:12:45.070445 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538\": container with ID starting with c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538 not found: ID does 
not exist" containerID="c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538" Dec 04 13:12:45 crc kubenswrapper[4760]: I1204 13:12:45.070474 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538"} err="failed to get container status \"c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538\": rpc error: code = NotFound desc = could not find container \"c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538\": container with ID starting with c8eb606cba5a535c6d332d8cdc1535d851781efdb82687014bf8fb4673032538 not found: ID does not exist" Dec 04 13:12:45 crc kubenswrapper[4760]: I1204 13:12:45.875474 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a4b6bb9-132e-4c88-a074-1475306b05b9" path="/var/lib/kubelet/pods/5a4b6bb9-132e-4c88-a074-1475306b05b9/volumes" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.351001 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6tlwq"] Dec 04 13:13:33 crc kubenswrapper[4760]: E1204 13:13:33.354441 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerName="extract-content" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.354482 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerName="extract-content" Dec 04 13:13:33 crc kubenswrapper[4760]: E1204 13:13:33.354519 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerName="extract-utilities" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.354527 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerName="extract-utilities" Dec 04 13:13:33 crc kubenswrapper[4760]: E1204 13:13:33.354572 4760 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerName="registry-server" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.354578 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerName="registry-server" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.354984 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a4b6bb9-132e-4c88-a074-1475306b05b9" containerName="registry-server" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.356942 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.370071 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tlwq"] Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.465324 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-catalog-content\") pod \"certified-operators-6tlwq\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.465590 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-utilities\") pod \"certified-operators-6tlwq\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.465747 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gskq\" (UniqueName: \"kubernetes.io/projected/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-kube-api-access-2gskq\") pod 
\"certified-operators-6tlwq\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.567979 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-utilities\") pod \"certified-operators-6tlwq\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.568103 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gskq\" (UniqueName: \"kubernetes.io/projected/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-kube-api-access-2gskq\") pod \"certified-operators-6tlwq\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.568193 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-catalog-content\") pod \"certified-operators-6tlwq\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.568603 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-utilities\") pod \"certified-operators-6tlwq\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.568703 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-catalog-content\") pod \"certified-operators-6tlwq\" (UID: 
\"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.593804 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gskq\" (UniqueName: \"kubernetes.io/projected/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-kube-api-access-2gskq\") pod \"certified-operators-6tlwq\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:33 crc kubenswrapper[4760]: I1204 13:13:33.685941 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:34 crc kubenswrapper[4760]: I1204 13:13:34.244595 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tlwq"] Dec 04 13:13:34 crc kubenswrapper[4760]: I1204 13:13:34.431729 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlwq" event={"ID":"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4","Type":"ContainerStarted","Data":"0d853d12d8fb102e4eba09f36fcbc1ad5210cfedbbae7f91b338ad545b584985"} Dec 04 13:13:35 crc kubenswrapper[4760]: I1204 13:13:35.441945 4760 generic.go:334] "Generic (PLEG): container finished" podID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerID="baf33f630685210fc5dbf0276f90c5f5794a338a71d7c9f1e57b7448c3f2f82d" exitCode=0 Dec 04 13:13:35 crc kubenswrapper[4760]: I1204 13:13:35.442005 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlwq" event={"ID":"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4","Type":"ContainerDied","Data":"baf33f630685210fc5dbf0276f90c5f5794a338a71d7c9f1e57b7448c3f2f82d"} Dec 04 13:13:37 crc kubenswrapper[4760]: I1204 13:13:37.465463 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlwq" 
event={"ID":"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4","Type":"ContainerStarted","Data":"2cebfcabea4e3226bf983f180c2a44f536d2ed5b40cef1ff0ba98ef41675594d"} Dec 04 13:13:37 crc kubenswrapper[4760]: I1204 13:13:37.932918 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-47cdn"] Dec 04 13:13:37 crc kubenswrapper[4760]: I1204 13:13:37.937097 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:37 crc kubenswrapper[4760]: I1204 13:13:37.974401 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-47cdn"] Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.075103 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqh7m\" (UniqueName: \"kubernetes.io/projected/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-kube-api-access-sqh7m\") pod \"redhat-marketplace-47cdn\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.075372 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-utilities\") pod \"redhat-marketplace-47cdn\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.075419 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-catalog-content\") pod \"redhat-marketplace-47cdn\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.178033 4760 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-utilities\") pod \"redhat-marketplace-47cdn\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.178114 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-catalog-content\") pod \"redhat-marketplace-47cdn\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.178418 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqh7m\" (UniqueName: \"kubernetes.io/projected/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-kube-api-access-sqh7m\") pod \"redhat-marketplace-47cdn\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.178842 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-utilities\") pod \"redhat-marketplace-47cdn\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.178883 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-catalog-content\") pod \"redhat-marketplace-47cdn\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.203169 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sqh7m\" (UniqueName: \"kubernetes.io/projected/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-kube-api-access-sqh7m\") pod \"redhat-marketplace-47cdn\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.276885 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.506717 4760 generic.go:334] "Generic (PLEG): container finished" podID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerID="2cebfcabea4e3226bf983f180c2a44f536d2ed5b40cef1ff0ba98ef41675594d" exitCode=0 Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.507004 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlwq" event={"ID":"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4","Type":"ContainerDied","Data":"2cebfcabea4e3226bf983f180c2a44f536d2ed5b40cef1ff0ba98ef41675594d"} Dec 04 13:13:38 crc kubenswrapper[4760]: I1204 13:13:38.888779 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-47cdn"] Dec 04 13:13:39 crc kubenswrapper[4760]: I1204 13:13:39.535782 4760 generic.go:334] "Generic (PLEG): container finished" podID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerID="c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3" exitCode=0 Dec 04 13:13:39 crc kubenswrapper[4760]: I1204 13:13:39.536155 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47cdn" event={"ID":"5c98c9c4-2ab7-41a2-8fc9-b809b7283591","Type":"ContainerDied","Data":"c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3"} Dec 04 13:13:39 crc kubenswrapper[4760]: I1204 13:13:39.536188 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47cdn" 
event={"ID":"5c98c9c4-2ab7-41a2-8fc9-b809b7283591","Type":"ContainerStarted","Data":"181b34ccc7cc7525749153e55ea2eb0e1c694443d5594c65ea31a59bff878cc0"} Dec 04 13:13:39 crc kubenswrapper[4760]: I1204 13:13:39.544280 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlwq" event={"ID":"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4","Type":"ContainerStarted","Data":"7aff6f0b24ffc875a490ad465003f8d6070b1ab323bd670a484c0e0b9482ad80"} Dec 04 13:13:39 crc kubenswrapper[4760]: I1204 13:13:39.601012 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6tlwq" podStartSLOduration=3.066334981 podStartE2EDuration="6.600992734s" podCreationTimestamp="2025-12-04 13:13:33 +0000 UTC" firstStartedPulling="2025-12-04 13:13:35.444831764 +0000 UTC m=+3618.486278321" lastFinishedPulling="2025-12-04 13:13:38.979489507 +0000 UTC m=+3622.020936074" observedRunningTime="2025-12-04 13:13:39.598494745 +0000 UTC m=+3622.639941312" watchObservedRunningTime="2025-12-04 13:13:39.600992734 +0000 UTC m=+3622.642439301" Dec 04 13:13:41 crc kubenswrapper[4760]: I1204 13:13:41.569030 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47cdn" event={"ID":"5c98c9c4-2ab7-41a2-8fc9-b809b7283591","Type":"ContainerStarted","Data":"9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620"} Dec 04 13:13:42 crc kubenswrapper[4760]: I1204 13:13:42.615426 4760 generic.go:334] "Generic (PLEG): container finished" podID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerID="9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620" exitCode=0 Dec 04 13:13:42 crc kubenswrapper[4760]: I1204 13:13:42.615598 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47cdn" 
event={"ID":"5c98c9c4-2ab7-41a2-8fc9-b809b7283591","Type":"ContainerDied","Data":"9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620"} Dec 04 13:13:43 crc kubenswrapper[4760]: I1204 13:13:43.632906 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47cdn" event={"ID":"5c98c9c4-2ab7-41a2-8fc9-b809b7283591","Type":"ContainerStarted","Data":"255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c"} Dec 04 13:13:43 crc kubenswrapper[4760]: I1204 13:13:43.653539 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-47cdn" podStartSLOduration=3.185001711 podStartE2EDuration="6.653508791s" podCreationTimestamp="2025-12-04 13:13:37 +0000 UTC" firstStartedPulling="2025-12-04 13:13:39.542121213 +0000 UTC m=+3622.583567780" lastFinishedPulling="2025-12-04 13:13:43.010628293 +0000 UTC m=+3626.052074860" observedRunningTime="2025-12-04 13:13:43.651826646 +0000 UTC m=+3626.693273213" watchObservedRunningTime="2025-12-04 13:13:43.653508791 +0000 UTC m=+3626.694955358" Dec 04 13:13:43 crc kubenswrapper[4760]: I1204 13:13:43.686255 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:43 crc kubenswrapper[4760]: I1204 13:13:43.686414 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:44 crc kubenswrapper[4760]: I1204 13:13:44.733829 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6tlwq" podUID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerName="registry-server" probeResult="failure" output=< Dec 04 13:13:44 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Dec 04 13:13:44 crc kubenswrapper[4760]: > Dec 04 13:13:48 crc kubenswrapper[4760]: I1204 13:13:48.278001 4760 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:48 crc kubenswrapper[4760]: I1204 13:13:48.278529 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:48 crc kubenswrapper[4760]: I1204 13:13:48.326502 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:48 crc kubenswrapper[4760]: I1204 13:13:48.740899 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:48 crc kubenswrapper[4760]: I1204 13:13:48.799256 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-47cdn"] Dec 04 13:13:50 crc kubenswrapper[4760]: I1204 13:13:50.694705 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-47cdn" podUID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerName="registry-server" containerID="cri-o://255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c" gracePeriod=2 Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.394721 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.599028 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqh7m\" (UniqueName: \"kubernetes.io/projected/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-kube-api-access-sqh7m\") pod \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.599242 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-utilities\") pod \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.599346 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-catalog-content\") pod \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\" (UID: \"5c98c9c4-2ab7-41a2-8fc9-b809b7283591\") " Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.599833 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-utilities" (OuterVolumeSpecName: "utilities") pod "5c98c9c4-2ab7-41a2-8fc9-b809b7283591" (UID: "5c98c9c4-2ab7-41a2-8fc9-b809b7283591"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.600146 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.612021 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-kube-api-access-sqh7m" (OuterVolumeSpecName: "kube-api-access-sqh7m") pod "5c98c9c4-2ab7-41a2-8fc9-b809b7283591" (UID: "5c98c9c4-2ab7-41a2-8fc9-b809b7283591"). InnerVolumeSpecName "kube-api-access-sqh7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.621071 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c98c9c4-2ab7-41a2-8fc9-b809b7283591" (UID: "5c98c9c4-2ab7-41a2-8fc9-b809b7283591"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.716539 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqh7m\" (UniqueName: \"kubernetes.io/projected/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-kube-api-access-sqh7m\") on node \"crc\" DevicePath \"\"" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.716596 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c98c9c4-2ab7-41a2-8fc9-b809b7283591-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.719978 4760 generic.go:334] "Generic (PLEG): container finished" podID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerID="255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c" exitCode=0 Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.720023 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47cdn" event={"ID":"5c98c9c4-2ab7-41a2-8fc9-b809b7283591","Type":"ContainerDied","Data":"255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c"} Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.720052 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47cdn" event={"ID":"5c98c9c4-2ab7-41a2-8fc9-b809b7283591","Type":"ContainerDied","Data":"181b34ccc7cc7525749153e55ea2eb0e1c694443d5594c65ea31a59bff878cc0"} Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.720071 4760 scope.go:117] "RemoveContainer" containerID="255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.720566 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47cdn" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.745544 4760 scope.go:117] "RemoveContainer" containerID="9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.780006 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-47cdn"] Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.793735 4760 scope.go:117] "RemoveContainer" containerID="c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.800956 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-47cdn"] Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.841892 4760 scope.go:117] "RemoveContainer" containerID="255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c" Dec 04 13:13:51 crc kubenswrapper[4760]: E1204 13:13:51.842702 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c\": container with ID starting with 255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c not found: ID does not exist" containerID="255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.842738 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c"} err="failed to get container status \"255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c\": rpc error: code = NotFound desc = could not find container \"255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c\": container with ID starting with 255098132a41a8cf584c42c5a3b19864bfa5a51fe62a83c36934486de74df64c not found: 
ID does not exist" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.842768 4760 scope.go:117] "RemoveContainer" containerID="9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620" Dec 04 13:13:51 crc kubenswrapper[4760]: E1204 13:13:51.843019 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620\": container with ID starting with 9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620 not found: ID does not exist" containerID="9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.843043 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620"} err="failed to get container status \"9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620\": rpc error: code = NotFound desc = could not find container \"9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620\": container with ID starting with 9015c01ad31c88798b0754b4006ee83dc8646ba40966cf6bebc66df78c156620 not found: ID does not exist" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.843060 4760 scope.go:117] "RemoveContainer" containerID="c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3" Dec 04 13:13:51 crc kubenswrapper[4760]: E1204 13:13:51.843448 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3\": container with ID starting with c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3 not found: ID does not exist" containerID="c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.843477 4760 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3"} err="failed to get container status \"c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3\": rpc error: code = NotFound desc = could not find container \"c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3\": container with ID starting with c93297530e053ef14183f06dd1bbeeede2952cdb404a4cfb176f5f312bc785f3 not found: ID does not exist" Dec 04 13:13:51 crc kubenswrapper[4760]: I1204 13:13:51.879904 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" path="/var/lib/kubelet/pods/5c98c9c4-2ab7-41a2-8fc9-b809b7283591/volumes" Dec 04 13:13:53 crc kubenswrapper[4760]: I1204 13:13:53.739695 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:53 crc kubenswrapper[4760]: I1204 13:13:53.803794 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:54 crc kubenswrapper[4760]: I1204 13:13:54.967011 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tlwq"] Dec 04 13:13:55 crc kubenswrapper[4760]: I1204 13:13:55.755017 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6tlwq" podUID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerName="registry-server" containerID="cri-o://7aff6f0b24ffc875a490ad465003f8d6070b1ab323bd670a484c0e0b9482ad80" gracePeriod=2 Dec 04 13:13:56 crc kubenswrapper[4760]: I1204 13:13:56.771492 4760 generic.go:334] "Generic (PLEG): container finished" podID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerID="7aff6f0b24ffc875a490ad465003f8d6070b1ab323bd670a484c0e0b9482ad80" exitCode=0 Dec 04 13:13:56 crc kubenswrapper[4760]: I1204 13:13:56.771561 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlwq" event={"ID":"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4","Type":"ContainerDied","Data":"7aff6f0b24ffc875a490ad465003f8d6070b1ab323bd670a484c0e0b9482ad80"} Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.065390 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.178279 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-utilities\") pod \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.178414 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-catalog-content\") pod \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.178552 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gskq\" (UniqueName: \"kubernetes.io/projected/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-kube-api-access-2gskq\") pod \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\" (UID: \"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4\") " Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.179273 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-utilities" (OuterVolumeSpecName: "utilities") pod "9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" (UID: "9c7e26e3-8815-4e2f-93c2-bf2ba739ded4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.184823 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-kube-api-access-2gskq" (OuterVolumeSpecName: "kube-api-access-2gskq") pod "9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" (UID: "9c7e26e3-8815-4e2f-93c2-bf2ba739ded4"). InnerVolumeSpecName "kube-api-access-2gskq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.223908 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" (UID: "9c7e26e3-8815-4e2f-93c2-bf2ba739ded4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.280785 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.280822 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.280836 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gskq\" (UniqueName: \"kubernetes.io/projected/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4-kube-api-access-2gskq\") on node \"crc\" DevicePath \"\"" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.783713 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlwq" 
event={"ID":"9c7e26e3-8815-4e2f-93c2-bf2ba739ded4","Type":"ContainerDied","Data":"0d853d12d8fb102e4eba09f36fcbc1ad5210cfedbbae7f91b338ad545b584985"} Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.783779 4760 scope.go:117] "RemoveContainer" containerID="7aff6f0b24ffc875a490ad465003f8d6070b1ab323bd670a484c0e0b9482ad80" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.783779 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tlwq" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.821153 4760 scope.go:117] "RemoveContainer" containerID="2cebfcabea4e3226bf983f180c2a44f536d2ed5b40cef1ff0ba98ef41675594d" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.834118 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tlwq"] Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.845577 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6tlwq"] Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.853845 4760 scope.go:117] "RemoveContainer" containerID="baf33f630685210fc5dbf0276f90c5f5794a338a71d7c9f1e57b7448c3f2f82d" Dec 04 13:13:57 crc kubenswrapper[4760]: I1204 13:13:57.880296 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" path="/var/lib/kubelet/pods/9c7e26e3-8815-4e2f-93c2-bf2ba739ded4/volumes" Dec 04 13:14:03 crc kubenswrapper[4760]: I1204 13:14:03.385160 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:14:03 crc kubenswrapper[4760]: I1204 13:14:03.385783 4760 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.031139 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rlb6w"] Dec 04 13:14:07 crc kubenswrapper[4760]: E1204 13:14:07.031982 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerName="registry-server" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.031996 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerName="registry-server" Dec 04 13:14:07 crc kubenswrapper[4760]: E1204 13:14:07.032011 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerName="extract-content" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.032017 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerName="extract-content" Dec 04 13:14:07 crc kubenswrapper[4760]: E1204 13:14:07.032026 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerName="registry-server" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.032034 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerName="registry-server" Dec 04 13:14:07 crc kubenswrapper[4760]: E1204 13:14:07.032049 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerName="extract-utilities" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.032056 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" 
containerName="extract-utilities" Dec 04 13:14:07 crc kubenswrapper[4760]: E1204 13:14:07.032063 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerName="extract-content" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.032069 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerName="extract-content" Dec 04 13:14:07 crc kubenswrapper[4760]: E1204 13:14:07.032104 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerName="extract-utilities" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.032110 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerName="extract-utilities" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.032323 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c98c9c4-2ab7-41a2-8fc9-b809b7283591" containerName="registry-server" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.032342 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c7e26e3-8815-4e2f-93c2-bf2ba739ded4" containerName="registry-server" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.035986 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.049511 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rlb6w"] Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.224426 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-utilities\") pod \"redhat-operators-rlb6w\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.224521 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrz4w\" (UniqueName: \"kubernetes.io/projected/01263a5f-bc8b-472c-b640-642622a02148-kube-api-access-xrz4w\") pod \"redhat-operators-rlb6w\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.224721 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-catalog-content\") pod \"redhat-operators-rlb6w\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.326861 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-catalog-content\") pod \"redhat-operators-rlb6w\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.327008 4760 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-utilities\") pod \"redhat-operators-rlb6w\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.327059 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrz4w\" (UniqueName: \"kubernetes.io/projected/01263a5f-bc8b-472c-b640-642622a02148-kube-api-access-xrz4w\") pod \"redhat-operators-rlb6w\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.327955 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-utilities\") pod \"redhat-operators-rlb6w\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.328023 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-catalog-content\") pod \"redhat-operators-rlb6w\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.351520 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrz4w\" (UniqueName: \"kubernetes.io/projected/01263a5f-bc8b-472c-b640-642622a02148-kube-api-access-xrz4w\") pod \"redhat-operators-rlb6w\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.365388 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:07 crc kubenswrapper[4760]: I1204 13:14:07.906236 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rlb6w"] Dec 04 13:14:08 crc kubenswrapper[4760]: I1204 13:14:08.910361 4760 generic.go:334] "Generic (PLEG): container finished" podID="01263a5f-bc8b-472c-b640-642622a02148" containerID="7aa916aa2d9f687b390623c39c5483e9e5d8fd57911cf0cca1c0d107bdcd8fbb" exitCode=0 Dec 04 13:14:08 crc kubenswrapper[4760]: I1204 13:14:08.910470 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlb6w" event={"ID":"01263a5f-bc8b-472c-b640-642622a02148","Type":"ContainerDied","Data":"7aa916aa2d9f687b390623c39c5483e9e5d8fd57911cf0cca1c0d107bdcd8fbb"} Dec 04 13:14:08 crc kubenswrapper[4760]: I1204 13:14:08.910657 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlb6w" event={"ID":"01263a5f-bc8b-472c-b640-642622a02148","Type":"ContainerStarted","Data":"c8da2ee2654163467469194556a14d9e7a4c59c49634869514107d45f94d2295"} Dec 04 13:14:09 crc kubenswrapper[4760]: I1204 13:14:09.926572 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlb6w" event={"ID":"01263a5f-bc8b-472c-b640-642622a02148","Type":"ContainerStarted","Data":"876eb3c9172ed0d1d9e030cfa435b19094e6bfce9f64de4657b0135209d780f9"} Dec 04 13:14:15 crc kubenswrapper[4760]: I1204 13:14:15.056145 4760 generic.go:334] "Generic (PLEG): container finished" podID="01263a5f-bc8b-472c-b640-642622a02148" containerID="876eb3c9172ed0d1d9e030cfa435b19094e6bfce9f64de4657b0135209d780f9" exitCode=0 Dec 04 13:14:15 crc kubenswrapper[4760]: I1204 13:14:15.056239 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlb6w" 
event={"ID":"01263a5f-bc8b-472c-b640-642622a02148","Type":"ContainerDied","Data":"876eb3c9172ed0d1d9e030cfa435b19094e6bfce9f64de4657b0135209d780f9"} Dec 04 13:14:16 crc kubenswrapper[4760]: I1204 13:14:16.068195 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlb6w" event={"ID":"01263a5f-bc8b-472c-b640-642622a02148","Type":"ContainerStarted","Data":"eb69d2cdf2d8d1dd8ddbb6d12d3db4268cac17db255dcbd20c7fa9fbea0253ab"} Dec 04 13:14:17 crc kubenswrapper[4760]: I1204 13:14:17.097076 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rlb6w" podStartSLOduration=4.342249966 podStartE2EDuration="11.097055285s" podCreationTimestamp="2025-12-04 13:14:06 +0000 UTC" firstStartedPulling="2025-12-04 13:14:08.912545727 +0000 UTC m=+3651.953992294" lastFinishedPulling="2025-12-04 13:14:15.667351046 +0000 UTC m=+3658.708797613" observedRunningTime="2025-12-04 13:14:17.094121642 +0000 UTC m=+3660.135568209" watchObservedRunningTime="2025-12-04 13:14:17.097055285 +0000 UTC m=+3660.138501852" Dec 04 13:14:17 crc kubenswrapper[4760]: I1204 13:14:17.366526 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:17 crc kubenswrapper[4760]: I1204 13:14:17.366591 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:18 crc kubenswrapper[4760]: I1204 13:14:18.415144 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rlb6w" podUID="01263a5f-bc8b-472c-b640-642622a02148" containerName="registry-server" probeResult="failure" output=< Dec 04 13:14:18 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Dec 04 13:14:18 crc kubenswrapper[4760]: > Dec 04 13:14:27 crc kubenswrapper[4760]: I1204 13:14:27.418660 4760 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:27 crc kubenswrapper[4760]: I1204 13:14:27.481125 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:27 crc kubenswrapper[4760]: I1204 13:14:27.663297 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rlb6w"] Dec 04 13:14:29 crc kubenswrapper[4760]: I1204 13:14:29.199331 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rlb6w" podUID="01263a5f-bc8b-472c-b640-642622a02148" containerName="registry-server" containerID="cri-o://eb69d2cdf2d8d1dd8ddbb6d12d3db4268cac17db255dcbd20c7fa9fbea0253ab" gracePeriod=2 Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.261139 4760 generic.go:334] "Generic (PLEG): container finished" podID="01263a5f-bc8b-472c-b640-642622a02148" containerID="eb69d2cdf2d8d1dd8ddbb6d12d3db4268cac17db255dcbd20c7fa9fbea0253ab" exitCode=0 Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.261172 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlb6w" event={"ID":"01263a5f-bc8b-472c-b640-642622a02148","Type":"ContainerDied","Data":"eb69d2cdf2d8d1dd8ddbb6d12d3db4268cac17db255dcbd20c7fa9fbea0253ab"} Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.591396 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.734662 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-catalog-content\") pod \"01263a5f-bc8b-472c-b640-642622a02148\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.734748 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrz4w\" (UniqueName: \"kubernetes.io/projected/01263a5f-bc8b-472c-b640-642622a02148-kube-api-access-xrz4w\") pod \"01263a5f-bc8b-472c-b640-642622a02148\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.735135 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-utilities\") pod \"01263a5f-bc8b-472c-b640-642622a02148\" (UID: \"01263a5f-bc8b-472c-b640-642622a02148\") " Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.736544 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-utilities" (OuterVolumeSpecName: "utilities") pod "01263a5f-bc8b-472c-b640-642622a02148" (UID: "01263a5f-bc8b-472c-b640-642622a02148"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.749530 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01263a5f-bc8b-472c-b640-642622a02148-kube-api-access-xrz4w" (OuterVolumeSpecName: "kube-api-access-xrz4w") pod "01263a5f-bc8b-472c-b640-642622a02148" (UID: "01263a5f-bc8b-472c-b640-642622a02148"). InnerVolumeSpecName "kube-api-access-xrz4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.837658 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.837721 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrz4w\" (UniqueName: \"kubernetes.io/projected/01263a5f-bc8b-472c-b640-642622a02148-kube-api-access-xrz4w\") on node \"crc\" DevicePath \"\"" Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.861855 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01263a5f-bc8b-472c-b640-642622a02148" (UID: "01263a5f-bc8b-472c-b640-642622a02148"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:14:30 crc kubenswrapper[4760]: I1204 13:14:30.942496 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01263a5f-bc8b-472c-b640-642622a02148-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:14:31 crc kubenswrapper[4760]: I1204 13:14:31.278128 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rlb6w" event={"ID":"01263a5f-bc8b-472c-b640-642622a02148","Type":"ContainerDied","Data":"c8da2ee2654163467469194556a14d9e7a4c59c49634869514107d45f94d2295"} Dec 04 13:14:31 crc kubenswrapper[4760]: I1204 13:14:31.279083 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rlb6w" Dec 04 13:14:31 crc kubenswrapper[4760]: I1204 13:14:31.279095 4760 scope.go:117] "RemoveContainer" containerID="eb69d2cdf2d8d1dd8ddbb6d12d3db4268cac17db255dcbd20c7fa9fbea0253ab" Dec 04 13:14:31 crc kubenswrapper[4760]: I1204 13:14:31.327347 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rlb6w"] Dec 04 13:14:31 crc kubenswrapper[4760]: I1204 13:14:31.330556 4760 scope.go:117] "RemoveContainer" containerID="876eb3c9172ed0d1d9e030cfa435b19094e6bfce9f64de4657b0135209d780f9" Dec 04 13:14:31 crc kubenswrapper[4760]: I1204 13:14:31.336861 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rlb6w"] Dec 04 13:14:31 crc kubenswrapper[4760]: I1204 13:14:31.381353 4760 scope.go:117] "RemoveContainer" containerID="7aa916aa2d9f687b390623c39c5483e9e5d8fd57911cf0cca1c0d107bdcd8fbb" Dec 04 13:14:31 crc kubenswrapper[4760]: I1204 13:14:31.878904 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01263a5f-bc8b-472c-b640-642622a02148" path="/var/lib/kubelet/pods/01263a5f-bc8b-472c-b640-642622a02148/volumes" Dec 04 13:14:33 crc kubenswrapper[4760]: I1204 13:14:33.380839 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:14:33 crc kubenswrapper[4760]: I1204 13:14:33.381237 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 
13:15:00.179554 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557"] Dec 04 13:15:00 crc kubenswrapper[4760]: E1204 13:15:00.180904 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01263a5f-bc8b-472c-b640-642622a02148" containerName="extract-content" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.180931 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="01263a5f-bc8b-472c-b640-642622a02148" containerName="extract-content" Dec 04 13:15:00 crc kubenswrapper[4760]: E1204 13:15:00.180991 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01263a5f-bc8b-472c-b640-642622a02148" containerName="extract-utilities" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.181003 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="01263a5f-bc8b-472c-b640-642622a02148" containerName="extract-utilities" Dec 04 13:15:00 crc kubenswrapper[4760]: E1204 13:15:00.181027 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01263a5f-bc8b-472c-b640-642622a02148" containerName="registry-server" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.181036 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="01263a5f-bc8b-472c-b640-642622a02148" containerName="registry-server" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.181382 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="01263a5f-bc8b-472c-b640-642622a02148" containerName="registry-server" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.182330 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.185950 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.186404 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.193011 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e53000b-5944-4439-8991-8f8a2337fb31-secret-volume\") pod \"collect-profiles-29414235-gn557\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.193576 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e53000b-5944-4439-8991-8f8a2337fb31-config-volume\") pod \"collect-profiles-29414235-gn557\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.193705 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djrq2\" (UniqueName: \"kubernetes.io/projected/0e53000b-5944-4439-8991-8f8a2337fb31-kube-api-access-djrq2\") pod \"collect-profiles-29414235-gn557\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.198586 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557"] Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.296059 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e53000b-5944-4439-8991-8f8a2337fb31-secret-volume\") pod \"collect-profiles-29414235-gn557\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.296173 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djrq2\" (UniqueName: \"kubernetes.io/projected/0e53000b-5944-4439-8991-8f8a2337fb31-kube-api-access-djrq2\") pod \"collect-profiles-29414235-gn557\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.296296 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e53000b-5944-4439-8991-8f8a2337fb31-config-volume\") pod \"collect-profiles-29414235-gn557\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.297506 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e53000b-5944-4439-8991-8f8a2337fb31-config-volume\") pod \"collect-profiles-29414235-gn557\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.305113 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0e53000b-5944-4439-8991-8f8a2337fb31-secret-volume\") pod \"collect-profiles-29414235-gn557\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.323103 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djrq2\" (UniqueName: \"kubernetes.io/projected/0e53000b-5944-4439-8991-8f8a2337fb31-kube-api-access-djrq2\") pod \"collect-profiles-29414235-gn557\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:00 crc kubenswrapper[4760]: I1204 13:15:00.512677 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:02 crc kubenswrapper[4760]: I1204 13:15:01.039159 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557"] Dec 04 13:15:02 crc kubenswrapper[4760]: I1204 13:15:01.658032 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" event={"ID":"0e53000b-5944-4439-8991-8f8a2337fb31","Type":"ContainerStarted","Data":"ade4aef42f7999b409dac4b4e616a67c3460bd369e8d64ac818a00ad17eb25ad"} Dec 04 13:15:02 crc kubenswrapper[4760]: I1204 13:15:01.658448 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" event={"ID":"0e53000b-5944-4439-8991-8f8a2337fb31","Type":"ContainerStarted","Data":"e323e5ba42fbca43e3deab9fdfa7f3e2a488cb1d16d461a3843d3d1d9d3a4eaa"} Dec 04 13:15:02 crc kubenswrapper[4760]: I1204 13:15:01.689318 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" 
podStartSLOduration=1.689290374 podStartE2EDuration="1.689290374s" podCreationTimestamp="2025-12-04 13:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 13:15:01.683161219 +0000 UTC m=+3704.724607786" watchObservedRunningTime="2025-12-04 13:15:01.689290374 +0000 UTC m=+3704.730736941" Dec 04 13:15:02 crc kubenswrapper[4760]: I1204 13:15:02.669518 4760 generic.go:334] "Generic (PLEG): container finished" podID="0e53000b-5944-4439-8991-8f8a2337fb31" containerID="ade4aef42f7999b409dac4b4e616a67c3460bd369e8d64ac818a00ad17eb25ad" exitCode=0 Dec 04 13:15:02 crc kubenswrapper[4760]: I1204 13:15:02.669573 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" event={"ID":"0e53000b-5944-4439-8991-8f8a2337fb31","Type":"ContainerDied","Data":"ade4aef42f7999b409dac4b4e616a67c3460bd369e8d64ac818a00ad17eb25ad"} Dec 04 13:15:03 crc kubenswrapper[4760]: I1204 13:15:03.380146 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:15:03 crc kubenswrapper[4760]: I1204 13:15:03.380235 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:15:03 crc kubenswrapper[4760]: I1204 13:15:03.380293 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 13:15:03 crc kubenswrapper[4760]: I1204 
13:15:03.381377 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 13:15:03 crc kubenswrapper[4760]: I1204 13:15:03.381444 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" gracePeriod=600 Dec 04 13:15:03 crc kubenswrapper[4760]: E1204 13:15:03.515300 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:15:03 crc kubenswrapper[4760]: I1204 13:15:03.684855 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" exitCode=0 Dec 04 13:15:03 crc kubenswrapper[4760]: I1204 13:15:03.684881 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a"} Dec 04 13:15:03 crc kubenswrapper[4760]: I1204 13:15:03.684976 4760 scope.go:117] "RemoveContainer" 
containerID="1129d67412d0397ffe3f601030fdfb33f4df842956c78105cfe22db28621df68" Dec 04 13:15:03 crc kubenswrapper[4760]: I1204 13:15:03.686025 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:15:03 crc kubenswrapper[4760]: E1204 13:15:03.686529 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.277872 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.303714 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e53000b-5944-4439-8991-8f8a2337fb31-config-volume\") pod \"0e53000b-5944-4439-8991-8f8a2337fb31\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.303795 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djrq2\" (UniqueName: \"kubernetes.io/projected/0e53000b-5944-4439-8991-8f8a2337fb31-kube-api-access-djrq2\") pod \"0e53000b-5944-4439-8991-8f8a2337fb31\" (UID: \"0e53000b-5944-4439-8991-8f8a2337fb31\") " Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.303969 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e53000b-5944-4439-8991-8f8a2337fb31-secret-volume\") pod \"0e53000b-5944-4439-8991-8f8a2337fb31\" (UID: 
\"0e53000b-5944-4439-8991-8f8a2337fb31\") " Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.304811 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e53000b-5944-4439-8991-8f8a2337fb31-config-volume" (OuterVolumeSpecName: "config-volume") pod "0e53000b-5944-4439-8991-8f8a2337fb31" (UID: "0e53000b-5944-4439-8991-8f8a2337fb31"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.305208 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e53000b-5944-4439-8991-8f8a2337fb31-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.314005 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e53000b-5944-4439-8991-8f8a2337fb31-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0e53000b-5944-4439-8991-8f8a2337fb31" (UID: "0e53000b-5944-4439-8991-8f8a2337fb31"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.335735 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e53000b-5944-4439-8991-8f8a2337fb31-kube-api-access-djrq2" (OuterVolumeSpecName: "kube-api-access-djrq2") pod "0e53000b-5944-4439-8991-8f8a2337fb31" (UID: "0e53000b-5944-4439-8991-8f8a2337fb31"). InnerVolumeSpecName "kube-api-access-djrq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.407691 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djrq2\" (UniqueName: \"kubernetes.io/projected/0e53000b-5944-4439-8991-8f8a2337fb31-kube-api-access-djrq2\") on node \"crc\" DevicePath \"\"" Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.407747 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e53000b-5944-4439-8991-8f8a2337fb31-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.696978 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" event={"ID":"0e53000b-5944-4439-8991-8f8a2337fb31","Type":"ContainerDied","Data":"e323e5ba42fbca43e3deab9fdfa7f3e2a488cb1d16d461a3843d3d1d9d3a4eaa"} Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.697030 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e323e5ba42fbca43e3deab9fdfa7f3e2a488cb1d16d461a3843d3d1d9d3a4eaa" Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.697063 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557" Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.771413 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4"] Dec 04 13:15:04 crc kubenswrapper[4760]: I1204 13:15:04.783659 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414190-f4ml4"] Dec 04 13:15:05 crc kubenswrapper[4760]: I1204 13:15:05.876795 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d27f1467-5844-498c-ab06-fdb1379f24b4" path="/var/lib/kubelet/pods/d27f1467-5844-498c-ab06-fdb1379f24b4/volumes" Dec 04 13:15:15 crc kubenswrapper[4760]: I1204 13:15:15.864557 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:15:15 crc kubenswrapper[4760]: E1204 13:15:15.865391 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:15:27 crc kubenswrapper[4760]: I1204 13:15:27.872267 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:15:27 crc kubenswrapper[4760]: E1204 13:15:27.873134 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:15:42 crc kubenswrapper[4760]: I1204 13:15:42.881550 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:15:42 crc kubenswrapper[4760]: E1204 13:15:42.882365 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:15:56 crc kubenswrapper[4760]: I1204 13:15:56.864383 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:15:56 crc kubenswrapper[4760]: E1204 13:15:56.865164 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:16:01 crc kubenswrapper[4760]: I1204 13:16:01.131847 4760 scope.go:117] "RemoveContainer" containerID="c976613583e16dd96390021f6bbbdeb2fcfdec7f23f7faefff7df9771fafb812" Dec 04 13:16:07 crc kubenswrapper[4760]: I1204 13:16:07.880665 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:16:07 crc kubenswrapper[4760]: E1204 13:16:07.882355 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:16:20 crc kubenswrapper[4760]: I1204 13:16:20.865505 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:16:20 crc kubenswrapper[4760]: E1204 13:16:20.866417 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:16:31 crc kubenswrapper[4760]: I1204 13:16:31.864477 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:16:31 crc kubenswrapper[4760]: E1204 13:16:31.865293 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:16:46 crc kubenswrapper[4760]: I1204 13:16:46.864646 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:16:46 crc kubenswrapper[4760]: E1204 13:16:46.865479 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:16:59 crc kubenswrapper[4760]: I1204 13:16:59.865552 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:16:59 crc kubenswrapper[4760]: E1204 13:16:59.866504 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:17:14 crc kubenswrapper[4760]: I1204 13:17:14.864978 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:17:14 crc kubenswrapper[4760]: E1204 13:17:14.865868 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:17:26 crc kubenswrapper[4760]: I1204 13:17:26.864164 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:17:26 crc kubenswrapper[4760]: E1204 13:17:26.865230 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:17:40 crc kubenswrapper[4760]: I1204 13:17:40.865567 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:17:40 crc kubenswrapper[4760]: E1204 13:17:40.866884 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:17:54 crc kubenswrapper[4760]: I1204 13:17:54.865452 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:17:54 crc kubenswrapper[4760]: E1204 13:17:54.866238 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:18:07 crc kubenswrapper[4760]: I1204 13:18:07.871768 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:18:07 crc kubenswrapper[4760]: E1204 13:18:07.872788 4760 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:18:20 crc kubenswrapper[4760]: I1204 13:18:20.864787 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:18:20 crc kubenswrapper[4760]: E1204 13:18:20.865698 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:18:35 crc kubenswrapper[4760]: I1204 13:18:35.864474 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:18:35 crc kubenswrapper[4760]: E1204 13:18:35.865481 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:18:47 crc kubenswrapper[4760]: I1204 13:18:47.873131 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:18:47 crc kubenswrapper[4760]: E1204 13:18:47.874048 4760 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:19:01 crc kubenswrapper[4760]: I1204 13:19:01.865330 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:19:01 crc kubenswrapper[4760]: E1204 13:19:01.866279 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:19:15 crc kubenswrapper[4760]: I1204 13:19:15.864911 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:19:15 crc kubenswrapper[4760]: E1204 13:19:15.865997 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:19:29 crc kubenswrapper[4760]: I1204 13:19:29.867615 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:19:29 crc kubenswrapper[4760]: E1204 13:19:29.868516 4760 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:19:42 crc kubenswrapper[4760]: I1204 13:19:42.867554 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:19:42 crc kubenswrapper[4760]: E1204 13:19:42.868314 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:19:54 crc kubenswrapper[4760]: I1204 13:19:54.864692 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:19:54 crc kubenswrapper[4760]: E1204 13:19:54.866751 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:20:05 crc kubenswrapper[4760]: I1204 13:20:05.864679 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a" Dec 04 13:20:06 crc kubenswrapper[4760]: I1204 
13:20:06.307615 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"d244835dffbd5b251ea1aa599bc2f6785676174e33bb5c4acba8280fd11ec1b9"}
Dec 04 13:22:33 crc kubenswrapper[4760]: I1204 13:22:33.380404 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 13:22:33 crc kubenswrapper[4760]: I1204 13:22:33.381011 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.603456 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xvzrm"]
Dec 04 13:22:36 crc kubenswrapper[4760]: E1204 13:22:36.604817 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e53000b-5944-4439-8991-8f8a2337fb31" containerName="collect-profiles"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.604835 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e53000b-5944-4439-8991-8f8a2337fb31" containerName="collect-profiles"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.605043 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e53000b-5944-4439-8991-8f8a2337fb31" containerName="collect-profiles"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.606915 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.621297 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvzrm"]
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.807863 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c56b\" (UniqueName: \"kubernetes.io/projected/75dade06-1eb9-44f4-b348-4560abe14f62-kube-api-access-6c56b\") pod \"community-operators-xvzrm\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") " pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.808520 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-utilities\") pod \"community-operators-xvzrm\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") " pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.808685 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-catalog-content\") pod \"community-operators-xvzrm\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") " pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.910848 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-utilities\") pod \"community-operators-xvzrm\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") " pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.910925 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-catalog-content\") pod \"community-operators-xvzrm\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") " pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.911462 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-catalog-content\") pod \"community-operators-xvzrm\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") " pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.911530 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-utilities\") pod \"community-operators-xvzrm\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") " pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.912835 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c56b\" (UniqueName: \"kubernetes.io/projected/75dade06-1eb9-44f4-b348-4560abe14f62-kube-api-access-6c56b\") pod \"community-operators-xvzrm\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") " pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:36 crc kubenswrapper[4760]: I1204 13:22:36.940068 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c56b\" (UniqueName: \"kubernetes.io/projected/75dade06-1eb9-44f4-b348-4560abe14f62-kube-api-access-6c56b\") pod \"community-operators-xvzrm\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") " pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:37 crc kubenswrapper[4760]: I1204 13:22:37.237835 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:37 crc kubenswrapper[4760]: I1204 13:22:37.776285 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvzrm"]
Dec 04 13:22:38 crc kubenswrapper[4760]: I1204 13:22:38.187377 4760 generic.go:334] "Generic (PLEG): container finished" podID="75dade06-1eb9-44f4-b348-4560abe14f62" containerID="73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400" exitCode=0
Dec 04 13:22:38 crc kubenswrapper[4760]: I1204 13:22:38.187499 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvzrm" event={"ID":"75dade06-1eb9-44f4-b348-4560abe14f62","Type":"ContainerDied","Data":"73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400"}
Dec 04 13:22:38 crc kubenswrapper[4760]: I1204 13:22:38.187730 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvzrm" event={"ID":"75dade06-1eb9-44f4-b348-4560abe14f62","Type":"ContainerStarted","Data":"afad6f95424248a9daeca508291a1d0e823643c3f1d35087e54655b3a23ee405"}
Dec 04 13:22:38 crc kubenswrapper[4760]: I1204 13:22:38.189530 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 13:22:39 crc kubenswrapper[4760]: I1204 13:22:39.201629 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvzrm" event={"ID":"75dade06-1eb9-44f4-b348-4560abe14f62","Type":"ContainerStarted","Data":"8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047"}
Dec 04 13:22:40 crc kubenswrapper[4760]: I1204 13:22:40.213702 4760 generic.go:334] "Generic (PLEG): container finished" podID="75dade06-1eb9-44f4-b348-4560abe14f62" containerID="8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047" exitCode=0
Dec 04 13:22:40 crc kubenswrapper[4760]: I1204 13:22:40.213774 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvzrm" event={"ID":"75dade06-1eb9-44f4-b348-4560abe14f62","Type":"ContainerDied","Data":"8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047"}
Dec 04 13:22:42 crc kubenswrapper[4760]: I1204 13:22:42.249436 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvzrm" event={"ID":"75dade06-1eb9-44f4-b348-4560abe14f62","Type":"ContainerStarted","Data":"f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc"}
Dec 04 13:22:42 crc kubenswrapper[4760]: I1204 13:22:42.286904 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xvzrm" podStartSLOduration=3.408547125 podStartE2EDuration="6.28687534s" podCreationTimestamp="2025-12-04 13:22:36 +0000 UTC" firstStartedPulling="2025-12-04 13:22:38.189283454 +0000 UTC m=+4161.230730021" lastFinishedPulling="2025-12-04 13:22:41.067611679 +0000 UTC m=+4164.109058236" observedRunningTime="2025-12-04 13:22:42.285731985 +0000 UTC m=+4165.327178562" watchObservedRunningTime="2025-12-04 13:22:42.28687534 +0000 UTC m=+4165.328321907"
Dec 04 13:22:47 crc kubenswrapper[4760]: I1204 13:22:47.238479 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:47 crc kubenswrapper[4760]: I1204 13:22:47.239021 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:47 crc kubenswrapper[4760]: I1204 13:22:47.300463 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:47 crc kubenswrapper[4760]: I1204 13:22:47.350905 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:48 crc kubenswrapper[4760]: I1204 13:22:48.388751 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvzrm"]
Dec 04 13:22:49 crc kubenswrapper[4760]: I1204 13:22:49.314198 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xvzrm" podUID="75dade06-1eb9-44f4-b348-4560abe14f62" containerName="registry-server" containerID="cri-o://f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc" gracePeriod=2
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.052080 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.234977 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-utilities\") pod \"75dade06-1eb9-44f4-b348-4560abe14f62\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") "
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.235762 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-utilities" (OuterVolumeSpecName: "utilities") pod "75dade06-1eb9-44f4-b348-4560abe14f62" (UID: "75dade06-1eb9-44f4-b348-4560abe14f62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.235931 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-catalog-content\") pod \"75dade06-1eb9-44f4-b348-4560abe14f62\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") "
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.245507 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c56b\" (UniqueName: \"kubernetes.io/projected/75dade06-1eb9-44f4-b348-4560abe14f62-kube-api-access-6c56b\") pod \"75dade06-1eb9-44f4-b348-4560abe14f62\" (UID: \"75dade06-1eb9-44f4-b348-4560abe14f62\") "
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.246486 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.252463 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75dade06-1eb9-44f4-b348-4560abe14f62-kube-api-access-6c56b" (OuterVolumeSpecName: "kube-api-access-6c56b") pod "75dade06-1eb9-44f4-b348-4560abe14f62" (UID: "75dade06-1eb9-44f4-b348-4560abe14f62"). InnerVolumeSpecName "kube-api-access-6c56b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.289136 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75dade06-1eb9-44f4-b348-4560abe14f62" (UID: "75dade06-1eb9-44f4-b348-4560abe14f62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.325646 4760 generic.go:334] "Generic (PLEG): container finished" podID="75dade06-1eb9-44f4-b348-4560abe14f62" containerID="f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc" exitCode=0
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.325717 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvzrm"
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.325712 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvzrm" event={"ID":"75dade06-1eb9-44f4-b348-4560abe14f62","Type":"ContainerDied","Data":"f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc"}
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.326127 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvzrm" event={"ID":"75dade06-1eb9-44f4-b348-4560abe14f62","Type":"ContainerDied","Data":"afad6f95424248a9daeca508291a1d0e823643c3f1d35087e54655b3a23ee405"}
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.326150 4760 scope.go:117] "RemoveContainer" containerID="f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc"
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.348585 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75dade06-1eb9-44f4-b348-4560abe14f62-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.348625 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c56b\" (UniqueName: \"kubernetes.io/projected/75dade06-1eb9-44f4-b348-4560abe14f62-kube-api-access-6c56b\") on node \"crc\" DevicePath \"\""
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.351581 4760 scope.go:117] "RemoveContainer" containerID="8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047"
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.381265 4760 scope.go:117] "RemoveContainer" containerID="73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400"
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.389339 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvzrm"]
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.392543 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xvzrm"]
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.420878 4760 scope.go:117] "RemoveContainer" containerID="f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc"
Dec 04 13:22:50 crc kubenswrapper[4760]: E1204 13:22:50.421352 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc\": container with ID starting with f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc not found: ID does not exist" containerID="f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc"
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.421445 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc"} err="failed to get container status \"f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc\": rpc error: code = NotFound desc = could not find container \"f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc\": container with ID starting with f6f5941aabd1815bf5a3eaa5979a0ff08611f4a783f36b0ce6edd7e449785ebc not found: ID does not exist"
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.421484 4760 scope.go:117] "RemoveContainer" containerID="8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047"
Dec 04 13:22:50 crc kubenswrapper[4760]: E1204 13:22:50.421867 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047\": container with ID starting with 8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047 not found: ID does not exist" containerID="8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047"
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.421906 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047"} err="failed to get container status \"8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047\": rpc error: code = NotFound desc = could not find container \"8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047\": container with ID starting with 8a71b916e7c3fee6abc32bb5c69f97863a9ceee3e14c97c757d93ee1e45d5047 not found: ID does not exist"
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.421931 4760 scope.go:117] "RemoveContainer" containerID="73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400"
Dec 04 13:22:50 crc kubenswrapper[4760]: E1204 13:22:50.422322 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400\": container with ID starting with 73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400 not found: ID does not exist" containerID="73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400"
Dec 04 13:22:50 crc kubenswrapper[4760]: I1204 13:22:50.422425 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400"} err="failed to get container status \"73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400\": rpc error: code = NotFound desc = could not find container \"73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400\": container with ID starting with 73c093c3d85f2e1c2c3f1f88af4545c74507cd39bc64cdeaee60b1a692af8400 not found: ID does not exist"
Dec 04 13:22:51 crc kubenswrapper[4760]: I1204 13:22:51.889112 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75dade06-1eb9-44f4-b348-4560abe14f62" path="/var/lib/kubelet/pods/75dade06-1eb9-44f4-b348-4560abe14f62/volumes"
Dec 04 13:23:03 crc kubenswrapper[4760]: I1204 13:23:03.380509 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 13:23:03 crc kubenswrapper[4760]: I1204 13:23:03.381172 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 13:23:33 crc kubenswrapper[4760]: I1204 13:23:33.380980 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 13:23:33 crc kubenswrapper[4760]: I1204 13:23:33.381834 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 13:23:33 crc kubenswrapper[4760]: I1204 13:23:33.381905 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9"
Dec 04 13:23:33 crc kubenswrapper[4760]: I1204 13:23:33.383066 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d244835dffbd5b251ea1aa599bc2f6785676174e33bb5c4acba8280fd11ec1b9"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 13:23:33 crc kubenswrapper[4760]: I1204 13:23:33.383131 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://d244835dffbd5b251ea1aa599bc2f6785676174e33bb5c4acba8280fd11ec1b9" gracePeriod=600
Dec 04 13:23:33 crc kubenswrapper[4760]: I1204 13:23:33.767257 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="d244835dffbd5b251ea1aa599bc2f6785676174e33bb5c4acba8280fd11ec1b9" exitCode=0
Dec 04 13:23:33 crc kubenswrapper[4760]: I1204 13:23:33.767350 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"d244835dffbd5b251ea1aa599bc2f6785676174e33bb5c4acba8280fd11ec1b9"}
Dec 04 13:23:33 crc kubenswrapper[4760]: I1204 13:23:33.767654 4760 scope.go:117] "RemoveContainer" containerID="5b3677680012f5a4249dcf8d6130da6e7791cc7679fe6dc8261cbd483418883a"
Dec 04 13:23:34 crc kubenswrapper[4760]: I1204 13:23:34.779137 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337"}
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.347363 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hw64h"]
Dec 04 13:24:49 crc kubenswrapper[4760]: E1204 13:24:49.348401 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dade06-1eb9-44f4-b348-4560abe14f62" containerName="extract-content"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.348424 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dade06-1eb9-44f4-b348-4560abe14f62" containerName="extract-content"
Dec 04 13:24:49 crc kubenswrapper[4760]: E1204 13:24:49.348460 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dade06-1eb9-44f4-b348-4560abe14f62" containerName="extract-utilities"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.348467 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dade06-1eb9-44f4-b348-4560abe14f62" containerName="extract-utilities"
Dec 04 13:24:49 crc kubenswrapper[4760]: E1204 13:24:49.348488 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75dade06-1eb9-44f4-b348-4560abe14f62" containerName="registry-server"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.348497 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dade06-1eb9-44f4-b348-4560abe14f62" containerName="registry-server"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.348689 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="75dade06-1eb9-44f4-b348-4560abe14f62" containerName="registry-server"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.350446 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.365413 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hw64h"]
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.529912 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-utilities\") pod \"redhat-operators-hw64h\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.529959 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-catalog-content\") pod \"redhat-operators-hw64h\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.530682 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc4jd\" (UniqueName: \"kubernetes.io/projected/60d43e71-fbbe-445b-8ea5-98708ba91a8d-kube-api-access-rc4jd\") pod \"redhat-operators-hw64h\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.632337 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc4jd\" (UniqueName: \"kubernetes.io/projected/60d43e71-fbbe-445b-8ea5-98708ba91a8d-kube-api-access-rc4jd\") pod \"redhat-operators-hw64h\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.632406 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-utilities\") pod \"redhat-operators-hw64h\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.632429 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-catalog-content\") pod \"redhat-operators-hw64h\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.633104 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-utilities\") pod \"redhat-operators-hw64h\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.633143 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-catalog-content\") pod \"redhat-operators-hw64h\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.657759 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc4jd\" (UniqueName: \"kubernetes.io/projected/60d43e71-fbbe-445b-8ea5-98708ba91a8d-kube-api-access-rc4jd\") pod \"redhat-operators-hw64h\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:49 crc kubenswrapper[4760]: I1204 13:24:49.675133 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:50 crc kubenswrapper[4760]: I1204 13:24:50.566945 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hw64h"]
Dec 04 13:24:50 crc kubenswrapper[4760]: I1204 13:24:50.717155 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw64h" event={"ID":"60d43e71-fbbe-445b-8ea5-98708ba91a8d","Type":"ContainerStarted","Data":"2b08ed69917f6af6ca6ca396c6f277afb29eeda8b4f8719853063f3d3c93c631"}
Dec 04 13:24:51 crc kubenswrapper[4760]: I1204 13:24:51.753899 4760 generic.go:334] "Generic (PLEG): container finished" podID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerID="968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124" exitCode=0
Dec 04 13:24:51 crc kubenswrapper[4760]: I1204 13:24:51.755413 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw64h" event={"ID":"60d43e71-fbbe-445b-8ea5-98708ba91a8d","Type":"ContainerDied","Data":"968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124"}
Dec 04 13:24:52 crc kubenswrapper[4760]: I1204 13:24:52.770486 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw64h" event={"ID":"60d43e71-fbbe-445b-8ea5-98708ba91a8d","Type":"ContainerStarted","Data":"007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c"}
Dec 04 13:24:54 crc kubenswrapper[4760]: I1204 13:24:54.792024 4760 generic.go:334] "Generic (PLEG): container finished" podID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerID="007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c" exitCode=0
Dec 04 13:24:54 crc kubenswrapper[4760]: I1204 13:24:54.792078 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw64h" event={"ID":"60d43e71-fbbe-445b-8ea5-98708ba91a8d","Type":"ContainerDied","Data":"007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c"}
Dec 04 13:24:55 crc kubenswrapper[4760]: I1204 13:24:55.812801 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw64h" event={"ID":"60d43e71-fbbe-445b-8ea5-98708ba91a8d","Type":"ContainerStarted","Data":"e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226"}
Dec 04 13:24:55 crc kubenswrapper[4760]: I1204 13:24:55.841829 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hw64h" podStartSLOduration=3.255606223 podStartE2EDuration="6.841778969s" podCreationTimestamp="2025-12-04 13:24:49 +0000 UTC" firstStartedPulling="2025-12-04 13:24:51.757836008 +0000 UTC m=+4294.799282575" lastFinishedPulling="2025-12-04 13:24:55.344008754 +0000 UTC m=+4298.385455321" observedRunningTime="2025-12-04 13:24:55.834062435 +0000 UTC m=+4298.875509022" watchObservedRunningTime="2025-12-04 13:24:55.841778969 +0000 UTC m=+4298.883225536"
Dec 04 13:24:59 crc kubenswrapper[4760]: I1204 13:24:59.677246 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:59 crc kubenswrapper[4760]: I1204 13:24:59.677813 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hw64h"
Dec 04 13:24:59 crc kubenswrapper[4760]: I1204 13:24:59.798512 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5drhf"]
Dec 04 13:24:59 crc kubenswrapper[4760]: I1204 13:24:59.801010 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:24:59 crc kubenswrapper[4760]: I1204 13:24:59.809554 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5drhf"]
Dec 04 13:24:59 crc kubenswrapper[4760]: I1204 13:24:59.948669 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-catalog-content\") pod \"redhat-marketplace-5drhf\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:24:59 crc kubenswrapper[4760]: I1204 13:24:59.948733 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27z2f\" (UniqueName: \"kubernetes.io/projected/d26f55d3-5077-4250-bc46-28497736e960-kube-api-access-27z2f\") pod \"redhat-marketplace-5drhf\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:24:59 crc kubenswrapper[4760]: I1204 13:24:59.948806 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-utilities\") pod \"redhat-marketplace-5drhf\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:25:00 crc kubenswrapper[4760]: I1204 13:25:00.051263 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-catalog-content\") pod \"redhat-marketplace-5drhf\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:25:00 crc kubenswrapper[4760]: I1204 13:25:00.051912 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27z2f\" (UniqueName: \"kubernetes.io/projected/d26f55d3-5077-4250-bc46-28497736e960-kube-api-access-27z2f\") pod \"redhat-marketplace-5drhf\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:25:00 crc kubenswrapper[4760]: I1204 13:25:00.052146 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-utilities\") pod \"redhat-marketplace-5drhf\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:25:00 crc kubenswrapper[4760]: I1204 13:25:00.052781 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-utilities\") pod \"redhat-marketplace-5drhf\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:25:00 crc kubenswrapper[4760]: I1204 13:25:00.052832 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-catalog-content\") pod \"redhat-marketplace-5drhf\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:25:00 crc kubenswrapper[4760]: I1204 13:25:00.083390 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27z2f\" (UniqueName: \"kubernetes.io/projected/d26f55d3-5077-4250-bc46-28497736e960-kube-api-access-27z2f\") pod \"redhat-marketplace-5drhf\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:25:00 crc kubenswrapper[4760]: I1204 13:25:00.144044 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5drhf"
Dec 04 13:25:00 crc kubenswrapper[4760]: I1204 13:25:00.736341 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hw64h" podUID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerName="registry-server" probeResult="failure" output=<
Dec 04 13:25:00 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s
Dec 04 13:25:00 crc kubenswrapper[4760]: >
Dec 04 13:25:00 crc kubenswrapper[4760]: I1204 13:25:00.785959 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5drhf"]
Dec 04 13:25:00 crc kubenswrapper[4760]: I1204 13:25:00.895168 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drhf" event={"ID":"d26f55d3-5077-4250-bc46-28497736e960","Type":"ContainerStarted","Data":"65c31c60c8932da4f2b3d753efcf17d8b77e1c6b2f98e494ca5a382a72bb7b04"}
Dec 04 13:25:01 crc kubenswrapper[4760]: I1204 13:25:01.923760 4760 generic.go:334] "Generic (PLEG): container finished" podID="d26f55d3-5077-4250-bc46-28497736e960" containerID="da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671" exitCode=0
Dec 04 13:25:01 crc kubenswrapper[4760]: I1204 13:25:01.924065 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drhf" event={"ID":"d26f55d3-5077-4250-bc46-28497736e960","Type":"ContainerDied","Data":"da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671"}
Dec 04 13:25:02 crc kubenswrapper[4760]: I1204 13:25:02.936871 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drhf" event={"ID":"d26f55d3-5077-4250-bc46-28497736e960","Type":"ContainerStarted","Data":"39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20"}
Dec 04 13:25:03 crc kubenswrapper[4760]: I1204 13:25:03.948561 4760 generic.go:334] "Generic (PLEG): container 
finished" podID="d26f55d3-5077-4250-bc46-28497736e960" containerID="39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20" exitCode=0 Dec 04 13:25:03 crc kubenswrapper[4760]: I1204 13:25:03.948724 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drhf" event={"ID":"d26f55d3-5077-4250-bc46-28497736e960","Type":"ContainerDied","Data":"39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20"} Dec 04 13:25:05 crc kubenswrapper[4760]: I1204 13:25:05.980715 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drhf" event={"ID":"d26f55d3-5077-4250-bc46-28497736e960","Type":"ContainerStarted","Data":"1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb"} Dec 04 13:25:06 crc kubenswrapper[4760]: I1204 13:25:06.004059 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5drhf" podStartSLOduration=4.019567023 podStartE2EDuration="7.00403443s" podCreationTimestamp="2025-12-04 13:24:59 +0000 UTC" firstStartedPulling="2025-12-04 13:25:01.9426567 +0000 UTC m=+4304.984103267" lastFinishedPulling="2025-12-04 13:25:04.927124107 +0000 UTC m=+4307.968570674" observedRunningTime="2025-12-04 13:25:05.999062833 +0000 UTC m=+4309.040509400" watchObservedRunningTime="2025-12-04 13:25:06.00403443 +0000 UTC m=+4309.045480997" Dec 04 13:25:09 crc kubenswrapper[4760]: I1204 13:25:09.727420 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hw64h" Dec 04 13:25:09 crc kubenswrapper[4760]: I1204 13:25:09.778180 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hw64h" Dec 04 13:25:09 crc kubenswrapper[4760]: I1204 13:25:09.973159 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hw64h"] Dec 04 13:25:10 crc 
kubenswrapper[4760]: I1204 13:25:10.145121 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5drhf" Dec 04 13:25:10 crc kubenswrapper[4760]: I1204 13:25:10.145162 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5drhf" Dec 04 13:25:10 crc kubenswrapper[4760]: I1204 13:25:10.196674 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5drhf" Dec 04 13:25:11 crc kubenswrapper[4760]: I1204 13:25:11.051388 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hw64h" podUID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerName="registry-server" containerID="cri-o://e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226" gracePeriod=2 Dec 04 13:25:11 crc kubenswrapper[4760]: I1204 13:25:11.119643 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5drhf" Dec 04 13:25:11 crc kubenswrapper[4760]: I1204 13:25:11.812285 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hw64h" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.016113 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-catalog-content\") pod \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.016840 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-utilities\") pod \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.017228 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc4jd\" (UniqueName: \"kubernetes.io/projected/60d43e71-fbbe-445b-8ea5-98708ba91a8d-kube-api-access-rc4jd\") pod \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\" (UID: \"60d43e71-fbbe-445b-8ea5-98708ba91a8d\") " Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.026779 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-utilities" (OuterVolumeSpecName: "utilities") pod "60d43e71-fbbe-445b-8ea5-98708ba91a8d" (UID: "60d43e71-fbbe-445b-8ea5-98708ba91a8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.040642 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d43e71-fbbe-445b-8ea5-98708ba91a8d-kube-api-access-rc4jd" (OuterVolumeSpecName: "kube-api-access-rc4jd") pod "60d43e71-fbbe-445b-8ea5-98708ba91a8d" (UID: "60d43e71-fbbe-445b-8ea5-98708ba91a8d"). InnerVolumeSpecName "kube-api-access-rc4jd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.063810 4760 generic.go:334] "Generic (PLEG): container finished" podID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerID="e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226" exitCode=0 Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.064330 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw64h" event={"ID":"60d43e71-fbbe-445b-8ea5-98708ba91a8d","Type":"ContainerDied","Data":"e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226"} Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.064372 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hw64h" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.064388 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hw64h" event={"ID":"60d43e71-fbbe-445b-8ea5-98708ba91a8d","Type":"ContainerDied","Data":"2b08ed69917f6af6ca6ca396c6f277afb29eeda8b4f8719853063f3d3c93c631"} Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.064411 4760 scope.go:117] "RemoveContainer" containerID="e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.112306 4760 scope.go:117] "RemoveContainer" containerID="007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.120957 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc4jd\" (UniqueName: \"kubernetes.io/projected/60d43e71-fbbe-445b-8ea5-98708ba91a8d-kube-api-access-rc4jd\") on node \"crc\" DevicePath \"\"" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.120988 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.139737 4760 scope.go:117] "RemoveContainer" containerID="968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.185855 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60d43e71-fbbe-445b-8ea5-98708ba91a8d" (UID: "60d43e71-fbbe-445b-8ea5-98708ba91a8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.195017 4760 scope.go:117] "RemoveContainer" containerID="e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226" Dec 04 13:25:12 crc kubenswrapper[4760]: E1204 13:25:12.195924 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226\": container with ID starting with e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226 not found: ID does not exist" containerID="e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.195971 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226"} err="failed to get container status \"e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226\": rpc error: code = NotFound desc = could not find container \"e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226\": container with ID starting with e18e8a98bbcde0b101aac71ac3aa4b0fc342b944b17c734c10a9cd0102ee3226 not found: ID does not exist" Dec 04 13:25:12 crc 
kubenswrapper[4760]: I1204 13:25:12.196004 4760 scope.go:117] "RemoveContainer" containerID="007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c" Dec 04 13:25:12 crc kubenswrapper[4760]: E1204 13:25:12.199878 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c\": container with ID starting with 007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c not found: ID does not exist" containerID="007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.199931 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c"} err="failed to get container status \"007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c\": rpc error: code = NotFound desc = could not find container \"007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c\": container with ID starting with 007956f45992342e4211ea31f86f4660c6122d3cba05388b96fa544b3dbb806c not found: ID does not exist" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.199974 4760 scope.go:117] "RemoveContainer" containerID="968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124" Dec 04 13:25:12 crc kubenswrapper[4760]: E1204 13:25:12.201302 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124\": container with ID starting with 968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124 not found: ID does not exist" containerID="968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.201334 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124"} err="failed to get container status \"968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124\": rpc error: code = NotFound desc = could not find container \"968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124\": container with ID starting with 968e68137933224227631317971aaab4f74779d21972474435c4c0428a5f6124 not found: ID does not exist" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.222839 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d43e71-fbbe-445b-8ea5-98708ba91a8d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.402350 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hw64h"] Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.413305 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hw64h"] Dec 04 13:25:12 crc kubenswrapper[4760]: I1204 13:25:12.571480 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5drhf"] Dec 04 13:25:13 crc kubenswrapper[4760]: I1204 13:25:13.075740 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5drhf" podUID="d26f55d3-5077-4250-bc46-28497736e960" containerName="registry-server" containerID="cri-o://1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb" gracePeriod=2 Dec 04 13:25:13 crc kubenswrapper[4760]: I1204 13:25:13.785065 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5drhf" Dec 04 13:25:13 crc kubenswrapper[4760]: I1204 13:25:13.878468 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" path="/var/lib/kubelet/pods/60d43e71-fbbe-445b-8ea5-98708ba91a8d/volumes" Dec 04 13:25:13 crc kubenswrapper[4760]: I1204 13:25:13.959177 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27z2f\" (UniqueName: \"kubernetes.io/projected/d26f55d3-5077-4250-bc46-28497736e960-kube-api-access-27z2f\") pod \"d26f55d3-5077-4250-bc46-28497736e960\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " Dec 04 13:25:13 crc kubenswrapper[4760]: I1204 13:25:13.959317 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-utilities\") pod \"d26f55d3-5077-4250-bc46-28497736e960\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " Dec 04 13:25:13 crc kubenswrapper[4760]: I1204 13:25:13.959497 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-catalog-content\") pod \"d26f55d3-5077-4250-bc46-28497736e960\" (UID: \"d26f55d3-5077-4250-bc46-28497736e960\") " Dec 04 13:25:13 crc kubenswrapper[4760]: I1204 13:25:13.961443 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-utilities" (OuterVolumeSpecName: "utilities") pod "d26f55d3-5077-4250-bc46-28497736e960" (UID: "d26f55d3-5077-4250-bc46-28497736e960"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:25:13 crc kubenswrapper[4760]: I1204 13:25:13.968581 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26f55d3-5077-4250-bc46-28497736e960-kube-api-access-27z2f" (OuterVolumeSpecName: "kube-api-access-27z2f") pod "d26f55d3-5077-4250-bc46-28497736e960" (UID: "d26f55d3-5077-4250-bc46-28497736e960"). InnerVolumeSpecName "kube-api-access-27z2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:25:13 crc kubenswrapper[4760]: I1204 13:25:13.980810 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d26f55d3-5077-4250-bc46-28497736e960" (UID: "d26f55d3-5077-4250-bc46-28497736e960"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.062486 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.062523 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26f55d3-5077-4250-bc46-28497736e960-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.062966 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27z2f\" (UniqueName: \"kubernetes.io/projected/d26f55d3-5077-4250-bc46-28497736e960-kube-api-access-27z2f\") on node \"crc\" DevicePath \"\"" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.088077 4760 generic.go:334] "Generic (PLEG): container finished" podID="d26f55d3-5077-4250-bc46-28497736e960" 
containerID="1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb" exitCode=0 Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.088138 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drhf" event={"ID":"d26f55d3-5077-4250-bc46-28497736e960","Type":"ContainerDied","Data":"1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb"} Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.088173 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5drhf" event={"ID":"d26f55d3-5077-4250-bc46-28497736e960","Type":"ContainerDied","Data":"65c31c60c8932da4f2b3d753efcf17d8b77e1c6b2f98e494ca5a382a72bb7b04"} Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.088195 4760 scope.go:117] "RemoveContainer" containerID="1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.088229 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5drhf" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.111524 4760 scope.go:117] "RemoveContainer" containerID="39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.131884 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5drhf"] Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.142815 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5drhf"] Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.148542 4760 scope.go:117] "RemoveContainer" containerID="da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.195998 4760 scope.go:117] "RemoveContainer" containerID="1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb" Dec 04 13:25:14 crc kubenswrapper[4760]: E1204 13:25:14.196753 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb\": container with ID starting with 1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb not found: ID does not exist" containerID="1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.196810 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb"} err="failed to get container status \"1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb\": rpc error: code = NotFound desc = could not find container \"1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb\": container with ID starting with 1f45dee65d5f8c4ceac84e1cf329ae931200451795b43f531eed6cde804139eb not found: 
ID does not exist" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.196843 4760 scope.go:117] "RemoveContainer" containerID="39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20" Dec 04 13:25:14 crc kubenswrapper[4760]: E1204 13:25:14.197299 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20\": container with ID starting with 39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20 not found: ID does not exist" containerID="39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.197345 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20"} err="failed to get container status \"39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20\": rpc error: code = NotFound desc = could not find container \"39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20\": container with ID starting with 39d1d9b18c367e240a29709dd7964c341f5156156a005081058e9797e4746c20 not found: ID does not exist" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.197373 4760 scope.go:117] "RemoveContainer" containerID="da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671" Dec 04 13:25:14 crc kubenswrapper[4760]: E1204 13:25:14.197693 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671\": container with ID starting with da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671 not found: ID does not exist" containerID="da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671" Dec 04 13:25:14 crc kubenswrapper[4760]: I1204 13:25:14.197720 4760 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671"} err="failed to get container status \"da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671\": rpc error: code = NotFound desc = could not find container \"da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671\": container with ID starting with da27357db8ad2adff18bd8a07403719892e4c55dd0e1c87c13aa35181b86c671 not found: ID does not exist" Dec 04 13:25:15 crc kubenswrapper[4760]: I1204 13:25:15.881763 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26f55d3-5077-4250-bc46-28497736e960" path="/var/lib/kubelet/pods/d26f55d3-5077-4250-bc46-28497736e960/volumes" Dec 04 13:25:33 crc kubenswrapper[4760]: I1204 13:25:33.380810 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:25:33 crc kubenswrapper[4760]: I1204 13:25:33.381439 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:26:03 crc kubenswrapper[4760]: I1204 13:26:03.380774 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:26:03 crc kubenswrapper[4760]: I1204 13:26:03.381361 4760 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:26:33 crc kubenswrapper[4760]: I1204 13:26:33.380470 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:26:33 crc kubenswrapper[4760]: I1204 13:26:33.381929 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:26:33 crc kubenswrapper[4760]: I1204 13:26:33.382050 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 13:26:33 crc kubenswrapper[4760]: I1204 13:26:33.383032 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 13:26:33 crc kubenswrapper[4760]: I1204 13:26:33.383173 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" 
containerID="cri-o://cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" gracePeriod=600 Dec 04 13:26:33 crc kubenswrapper[4760]: E1204 13:26:33.503524 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:26:34 crc kubenswrapper[4760]: I1204 13:26:34.498129 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" exitCode=0 Dec 04 13:26:34 crc kubenswrapper[4760]: I1204 13:26:34.498228 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337"} Dec 04 13:26:34 crc kubenswrapper[4760]: I1204 13:26:34.498522 4760 scope.go:117] "RemoveContainer" containerID="d244835dffbd5b251ea1aa599bc2f6785676174e33bb5c4acba8280fd11ec1b9" Dec 04 13:26:34 crc kubenswrapper[4760]: I1204 13:26:34.499296 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:26:34 crc kubenswrapper[4760]: E1204 13:26:34.499638 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" 
podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:26:46 crc kubenswrapper[4760]: I1204 13:26:46.865184 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:26:46 crc kubenswrapper[4760]: E1204 13:26:46.866103 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:26:58 crc kubenswrapper[4760]: I1204 13:26:58.865050 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:26:58 crc kubenswrapper[4760]: E1204 13:26:58.868694 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:27:09 crc kubenswrapper[4760]: I1204 13:27:09.864489 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:27:09 crc kubenswrapper[4760]: E1204 13:27:09.865309 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:27:24 crc kubenswrapper[4760]: I1204 13:27:24.864567 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:27:24 crc kubenswrapper[4760]: E1204 13:27:24.865446 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:27:36 crc kubenswrapper[4760]: I1204 13:27:36.864931 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:27:36 crc kubenswrapper[4760]: E1204 13:27:36.865782 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:27:50 crc kubenswrapper[4760]: I1204 13:27:50.865037 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:27:50 crc kubenswrapper[4760]: E1204 13:27:50.865861 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:28:04 crc kubenswrapper[4760]: I1204 13:28:04.864415 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:28:04 crc kubenswrapper[4760]: E1204 13:28:04.865175 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:28:16 crc kubenswrapper[4760]: I1204 13:28:16.864250 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:28:16 crc kubenswrapper[4760]: E1204 13:28:16.865077 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:28:30 crc kubenswrapper[4760]: I1204 13:28:30.865529 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:28:30 crc kubenswrapper[4760]: E1204 13:28:30.866554 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:28:41 crc kubenswrapper[4760]: I1204 13:28:41.865144 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:28:41 crc kubenswrapper[4760]: E1204 13:28:41.866095 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:28:55 crc kubenswrapper[4760]: I1204 13:28:55.864992 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:28:55 crc kubenswrapper[4760]: E1204 13:28:55.865836 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:29:09 crc kubenswrapper[4760]: I1204 13:29:09.871773 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:29:09 crc kubenswrapper[4760]: E1204 13:29:09.872560 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:29:21 crc kubenswrapper[4760]: I1204 13:29:21.864179 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:29:21 crc kubenswrapper[4760]: E1204 13:29:21.865258 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:29:34 crc kubenswrapper[4760]: I1204 13:29:34.865106 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:29:34 crc kubenswrapper[4760]: E1204 13:29:34.866868 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:29:45 crc kubenswrapper[4760]: I1204 13:29:45.864095 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:29:45 crc kubenswrapper[4760]: E1204 13:29:45.864999 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:29:58 crc kubenswrapper[4760]: I1204 13:29:58.864899 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:29:58 crc kubenswrapper[4760]: E1204 13:29:58.865769 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.209553 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv"] Dec 04 13:30:00 crc kubenswrapper[4760]: E1204 13:30:00.210395 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26f55d3-5077-4250-bc46-28497736e960" containerName="extract-utilities" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.210415 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26f55d3-5077-4250-bc46-28497736e960" containerName="extract-utilities" Dec 04 13:30:00 crc kubenswrapper[4760]: E1204 13:30:00.210443 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerName="extract-utilities" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.210449 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerName="extract-utilities" Dec 04 13:30:00 crc kubenswrapper[4760]: E1204 
13:30:00.210463 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26f55d3-5077-4250-bc46-28497736e960" containerName="extract-content" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.210469 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26f55d3-5077-4250-bc46-28497736e960" containerName="extract-content" Dec 04 13:30:00 crc kubenswrapper[4760]: E1204 13:30:00.210482 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerName="extract-content" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.210488 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerName="extract-content" Dec 04 13:30:00 crc kubenswrapper[4760]: E1204 13:30:00.210503 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26f55d3-5077-4250-bc46-28497736e960" containerName="registry-server" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.210509 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26f55d3-5077-4250-bc46-28497736e960" containerName="registry-server" Dec 04 13:30:00 crc kubenswrapper[4760]: E1204 13:30:00.210525 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerName="registry-server" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.210531 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerName="registry-server" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.210753 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d43e71-fbbe-445b-8ea5-98708ba91a8d" containerName="registry-server" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.210874 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26f55d3-5077-4250-bc46-28497736e960" containerName="registry-server" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 
13:30:00.211697 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.215924 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.218826 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.222494 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv"] Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.283868 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpsqq\" (UniqueName: \"kubernetes.io/projected/1bbe9c38-a45a-44cd-810a-6ee88a66d006-kube-api-access-xpsqq\") pod \"collect-profiles-29414250-pcnmv\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.284037 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bbe9c38-a45a-44cd-810a-6ee88a66d006-secret-volume\") pod \"collect-profiles-29414250-pcnmv\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.284133 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bbe9c38-a45a-44cd-810a-6ee88a66d006-config-volume\") pod \"collect-profiles-29414250-pcnmv\" (UID: 
\"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.386312 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bbe9c38-a45a-44cd-810a-6ee88a66d006-secret-volume\") pod \"collect-profiles-29414250-pcnmv\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.386422 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bbe9c38-a45a-44cd-810a-6ee88a66d006-config-volume\") pod \"collect-profiles-29414250-pcnmv\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.386677 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpsqq\" (UniqueName: \"kubernetes.io/projected/1bbe9c38-a45a-44cd-810a-6ee88a66d006-kube-api-access-xpsqq\") pod \"collect-profiles-29414250-pcnmv\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.387713 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bbe9c38-a45a-44cd-810a-6ee88a66d006-config-volume\") pod \"collect-profiles-29414250-pcnmv\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.393249 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1bbe9c38-a45a-44cd-810a-6ee88a66d006-secret-volume\") pod \"collect-profiles-29414250-pcnmv\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.406843 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpsqq\" (UniqueName: \"kubernetes.io/projected/1bbe9c38-a45a-44cd-810a-6ee88a66d006-kube-api-access-xpsqq\") pod \"collect-profiles-29414250-pcnmv\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:00 crc kubenswrapper[4760]: I1204 13:30:00.540116 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:01 crc kubenswrapper[4760]: I1204 13:30:01.082383 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv"] Dec 04 13:30:01 crc kubenswrapper[4760]: I1204 13:30:01.689854 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" event={"ID":"1bbe9c38-a45a-44cd-810a-6ee88a66d006","Type":"ContainerStarted","Data":"d8d8e27bbc3818677be39c0cf4dd63cb78f4f9617818b357db8df66395306487"} Dec 04 13:30:01 crc kubenswrapper[4760]: I1204 13:30:01.690459 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" event={"ID":"1bbe9c38-a45a-44cd-810a-6ee88a66d006","Type":"ContainerStarted","Data":"7c0477d5c7566312893e7b92eba81c15e7d6a0ece9920fdc79e765c20e264d14"} Dec 04 13:30:01 crc kubenswrapper[4760]: I1204 13:30:01.720626 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" 
podStartSLOduration=1.720604506 podStartE2EDuration="1.720604506s" podCreationTimestamp="2025-12-04 13:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 13:30:01.711535498 +0000 UTC m=+4604.752982065" watchObservedRunningTime="2025-12-04 13:30:01.720604506 +0000 UTC m=+4604.762051073" Dec 04 13:30:02 crc kubenswrapper[4760]: I1204 13:30:02.701633 4760 generic.go:334] "Generic (PLEG): container finished" podID="1bbe9c38-a45a-44cd-810a-6ee88a66d006" containerID="d8d8e27bbc3818677be39c0cf4dd63cb78f4f9617818b357db8df66395306487" exitCode=0 Dec 04 13:30:02 crc kubenswrapper[4760]: I1204 13:30:02.702234 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" event={"ID":"1bbe9c38-a45a-44cd-810a-6ee88a66d006","Type":"ContainerDied","Data":"d8d8e27bbc3818677be39c0cf4dd63cb78f4f9617818b357db8df66395306487"} Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.502476 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.583847 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bbe9c38-a45a-44cd-810a-6ee88a66d006-secret-volume\") pod \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.583913 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpsqq\" (UniqueName: \"kubernetes.io/projected/1bbe9c38-a45a-44cd-810a-6ee88a66d006-kube-api-access-xpsqq\") pod \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.584018 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bbe9c38-a45a-44cd-810a-6ee88a66d006-config-volume\") pod \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\" (UID: \"1bbe9c38-a45a-44cd-810a-6ee88a66d006\") " Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.585335 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bbe9c38-a45a-44cd-810a-6ee88a66d006-config-volume" (OuterVolumeSpecName: "config-volume") pod "1bbe9c38-a45a-44cd-810a-6ee88a66d006" (UID: "1bbe9c38-a45a-44cd-810a-6ee88a66d006"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.592845 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bbe9c38-a45a-44cd-810a-6ee88a66d006-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1bbe9c38-a45a-44cd-810a-6ee88a66d006" (UID: "1bbe9c38-a45a-44cd-810a-6ee88a66d006"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.592924 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bbe9c38-a45a-44cd-810a-6ee88a66d006-kube-api-access-xpsqq" (OuterVolumeSpecName: "kube-api-access-xpsqq") pod "1bbe9c38-a45a-44cd-810a-6ee88a66d006" (UID: "1bbe9c38-a45a-44cd-810a-6ee88a66d006"). InnerVolumeSpecName "kube-api-access-xpsqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.686114 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bbe9c38-a45a-44cd-810a-6ee88a66d006-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.686149 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpsqq\" (UniqueName: \"kubernetes.io/projected/1bbe9c38-a45a-44cd-810a-6ee88a66d006-kube-api-access-xpsqq\") on node \"crc\" DevicePath \"\"" Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.686158 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bbe9c38-a45a-44cd-810a-6ee88a66d006-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.721122 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" event={"ID":"1bbe9c38-a45a-44cd-810a-6ee88a66d006","Type":"ContainerDied","Data":"7c0477d5c7566312893e7b92eba81c15e7d6a0ece9920fdc79e765c20e264d14"} Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.721160 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c0477d5c7566312893e7b92eba81c15e7d6a0ece9920fdc79e765c20e264d14" Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.721243 4760 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414250-pcnmv" Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.803674 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8"] Dec 04 13:30:04 crc kubenswrapper[4760]: I1204 13:30:04.811896 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414205-2qmf8"] Dec 04 13:30:05 crc kubenswrapper[4760]: I1204 13:30:05.875913 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c36c4e4-823f-4293-be73-174340e8074f" path="/var/lib/kubelet/pods/7c36c4e4-823f-4293-be73-174340e8074f/volumes" Dec 04 13:30:12 crc kubenswrapper[4760]: I1204 13:30:12.865380 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:30:12 crc kubenswrapper[4760]: E1204 13:30:12.866196 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.268494 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dj2pn"] Dec 04 13:30:13 crc kubenswrapper[4760]: E1204 13:30:13.268967 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bbe9c38-a45a-44cd-810a-6ee88a66d006" containerName="collect-profiles" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.268983 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbe9c38-a45a-44cd-810a-6ee88a66d006" containerName="collect-profiles" Dec 04 
13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.269203 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bbe9c38-a45a-44cd-810a-6ee88a66d006" containerName="collect-profiles" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.270738 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.293931 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dj2pn"] Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.375644 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-utilities\") pod \"certified-operators-dj2pn\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.376126 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p57h\" (UniqueName: \"kubernetes.io/projected/185046f1-711c-4fd1-983b-9057cfdb108c-kube-api-access-4p57h\") pod \"certified-operators-dj2pn\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.376259 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-catalog-content\") pod \"certified-operators-dj2pn\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.478108 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-utilities\") pod \"certified-operators-dj2pn\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.478471 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p57h\" (UniqueName: \"kubernetes.io/projected/185046f1-711c-4fd1-983b-9057cfdb108c-kube-api-access-4p57h\") pod \"certified-operators-dj2pn\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.478676 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-catalog-content\") pod \"certified-operators-dj2pn\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.478765 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-utilities\") pod \"certified-operators-dj2pn\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.479162 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-catalog-content\") pod \"certified-operators-dj2pn\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.502010 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p57h\" (UniqueName: 
\"kubernetes.io/projected/185046f1-711c-4fd1-983b-9057cfdb108c-kube-api-access-4p57h\") pod \"certified-operators-dj2pn\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:13 crc kubenswrapper[4760]: I1204 13:30:13.617531 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:14 crc kubenswrapper[4760]: I1204 13:30:14.146363 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dj2pn"] Dec 04 13:30:14 crc kubenswrapper[4760]: I1204 13:30:14.846781 4760 generic.go:334] "Generic (PLEG): container finished" podID="185046f1-711c-4fd1-983b-9057cfdb108c" containerID="5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b" exitCode=0 Dec 04 13:30:14 crc kubenswrapper[4760]: I1204 13:30:14.846834 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj2pn" event={"ID":"185046f1-711c-4fd1-983b-9057cfdb108c","Type":"ContainerDied","Data":"5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b"} Dec 04 13:30:14 crc kubenswrapper[4760]: I1204 13:30:14.847609 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj2pn" event={"ID":"185046f1-711c-4fd1-983b-9057cfdb108c","Type":"ContainerStarted","Data":"09df61b350d57747a399f5962b4de355b86e31276a75abea3e89aec91bcf7568"} Dec 04 13:30:14 crc kubenswrapper[4760]: I1204 13:30:14.849329 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 13:30:15 crc kubenswrapper[4760]: I1204 13:30:15.916741 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj2pn" event={"ID":"185046f1-711c-4fd1-983b-9057cfdb108c","Type":"ContainerStarted","Data":"0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2"} Dec 04 13:30:17 
crc kubenswrapper[4760]: I1204 13:30:17.934243 4760 generic.go:334] "Generic (PLEG): container finished" podID="185046f1-711c-4fd1-983b-9057cfdb108c" containerID="0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2" exitCode=0 Dec 04 13:30:17 crc kubenswrapper[4760]: I1204 13:30:17.934672 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj2pn" event={"ID":"185046f1-711c-4fd1-983b-9057cfdb108c","Type":"ContainerDied","Data":"0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2"} Dec 04 13:30:18 crc kubenswrapper[4760]: I1204 13:30:18.946467 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj2pn" event={"ID":"185046f1-711c-4fd1-983b-9057cfdb108c","Type":"ContainerStarted","Data":"e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976"} Dec 04 13:30:18 crc kubenswrapper[4760]: I1204 13:30:18.979546 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dj2pn" podStartSLOduration=2.455694667 podStartE2EDuration="5.97952183s" podCreationTimestamp="2025-12-04 13:30:13 +0000 UTC" firstStartedPulling="2025-12-04 13:30:14.849034081 +0000 UTC m=+4617.890480648" lastFinishedPulling="2025-12-04 13:30:18.372861244 +0000 UTC m=+4621.414307811" observedRunningTime="2025-12-04 13:30:18.966952951 +0000 UTC m=+4622.008399528" watchObservedRunningTime="2025-12-04 13:30:18.97952183 +0000 UTC m=+4622.020968397" Dec 04 13:30:23 crc kubenswrapper[4760]: I1204 13:30:23.618644 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:23 crc kubenswrapper[4760]: I1204 13:30:23.619265 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:24 crc kubenswrapper[4760]: I1204 13:30:24.409510 4760 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:24 crc kubenswrapper[4760]: I1204 13:30:24.473901 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:24 crc kubenswrapper[4760]: I1204 13:30:24.655900 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dj2pn"] Dec 04 13:30:24 crc kubenswrapper[4760]: I1204 13:30:24.865978 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:30:24 crc kubenswrapper[4760]: E1204 13:30:24.869420 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:30:26 crc kubenswrapper[4760]: I1204 13:30:26.178394 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dj2pn" podUID="185046f1-711c-4fd1-983b-9057cfdb108c" containerName="registry-server" containerID="cri-o://e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976" gracePeriod=2 Dec 04 13:30:26 crc kubenswrapper[4760]: I1204 13:30:26.976684 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.095409 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-catalog-content\") pod \"185046f1-711c-4fd1-983b-9057cfdb108c\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.095678 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p57h\" (UniqueName: \"kubernetes.io/projected/185046f1-711c-4fd1-983b-9057cfdb108c-kube-api-access-4p57h\") pod \"185046f1-711c-4fd1-983b-9057cfdb108c\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.096875 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-utilities\") pod \"185046f1-711c-4fd1-983b-9057cfdb108c\" (UID: \"185046f1-711c-4fd1-983b-9057cfdb108c\") " Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.098022 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-utilities" (OuterVolumeSpecName: "utilities") pod "185046f1-711c-4fd1-983b-9057cfdb108c" (UID: "185046f1-711c-4fd1-983b-9057cfdb108c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.102564 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185046f1-711c-4fd1-983b-9057cfdb108c-kube-api-access-4p57h" (OuterVolumeSpecName: "kube-api-access-4p57h") pod "185046f1-711c-4fd1-983b-9057cfdb108c" (UID: "185046f1-711c-4fd1-983b-9057cfdb108c"). InnerVolumeSpecName "kube-api-access-4p57h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.147721 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "185046f1-711c-4fd1-983b-9057cfdb108c" (UID: "185046f1-711c-4fd1-983b-9057cfdb108c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.188997 4760 generic.go:334] "Generic (PLEG): container finished" podID="185046f1-711c-4fd1-983b-9057cfdb108c" containerID="e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976" exitCode=0 Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.189048 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj2pn" event={"ID":"185046f1-711c-4fd1-983b-9057cfdb108c","Type":"ContainerDied","Data":"e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976"} Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.189073 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dj2pn" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.189102 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj2pn" event={"ID":"185046f1-711c-4fd1-983b-9057cfdb108c","Type":"ContainerDied","Data":"09df61b350d57747a399f5962b4de355b86e31276a75abea3e89aec91bcf7568"} Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.189122 4760 scope.go:117] "RemoveContainer" containerID="e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.200351 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p57h\" (UniqueName: \"kubernetes.io/projected/185046f1-711c-4fd1-983b-9057cfdb108c-kube-api-access-4p57h\") on node \"crc\" DevicePath \"\"" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.200380 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.200397 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/185046f1-711c-4fd1-983b-9057cfdb108c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.212920 4760 scope.go:117] "RemoveContainer" containerID="0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.229766 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dj2pn"] Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.240176 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dj2pn"] Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.240686 4760 scope.go:117] 
"RemoveContainer" containerID="5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.291690 4760 scope.go:117] "RemoveContainer" containerID="e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976" Dec 04 13:30:27 crc kubenswrapper[4760]: E1204 13:30:27.295704 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976\": container with ID starting with e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976 not found: ID does not exist" containerID="e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.295743 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976"} err="failed to get container status \"e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976\": rpc error: code = NotFound desc = could not find container \"e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976\": container with ID starting with e144ae5d35f203af097a447231ef749c411591228c6daeea1b925850b7b1b976 not found: ID does not exist" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.295772 4760 scope.go:117] "RemoveContainer" containerID="0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2" Dec 04 13:30:27 crc kubenswrapper[4760]: E1204 13:30:27.296508 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2\": container with ID starting with 0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2 not found: ID does not exist" containerID="0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2" Dec 04 13:30:27 crc 
kubenswrapper[4760]: I1204 13:30:27.296539 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2"} err="failed to get container status \"0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2\": rpc error: code = NotFound desc = could not find container \"0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2\": container with ID starting with 0dc1f8eecbfa02dc04f3047bd83dd749d4aa4f4c10922d5a48f6626ef13655f2 not found: ID does not exist" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.296554 4760 scope.go:117] "RemoveContainer" containerID="5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b" Dec 04 13:30:27 crc kubenswrapper[4760]: E1204 13:30:27.296753 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b\": container with ID starting with 5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b not found: ID does not exist" containerID="5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.296774 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b"} err="failed to get container status \"5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b\": rpc error: code = NotFound desc = could not find container \"5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b\": container with ID starting with 5f0b0e4a7860a4b56b93ebc7c35669babba8299d9a5d974779e009c0e8b3c23b not found: ID does not exist" Dec 04 13:30:27 crc kubenswrapper[4760]: I1204 13:30:27.936277 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="185046f1-711c-4fd1-983b-9057cfdb108c" 
path="/var/lib/kubelet/pods/185046f1-711c-4fd1-983b-9057cfdb108c/volumes" Dec 04 13:30:37 crc kubenswrapper[4760]: I1204 13:30:37.871499 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:30:37 crc kubenswrapper[4760]: E1204 13:30:37.872294 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:30:49 crc kubenswrapper[4760]: I1204 13:30:49.870527 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:30:49 crc kubenswrapper[4760]: E1204 13:30:49.877126 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:31:01 crc kubenswrapper[4760]: I1204 13:31:01.721365 4760 scope.go:117] "RemoveContainer" containerID="347187969c055364c9a7088863d1a84eb8c0de102adcab8babd897b07c0c683e" Dec 04 13:31:02 crc kubenswrapper[4760]: I1204 13:31:02.864550 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:31:02 crc kubenswrapper[4760]: E1204 13:31:02.865404 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:31:14 crc kubenswrapper[4760]: I1204 13:31:14.864752 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:31:14 crc kubenswrapper[4760]: E1204 13:31:14.865614 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:31:27 crc kubenswrapper[4760]: I1204 13:31:27.871631 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:31:27 crc kubenswrapper[4760]: E1204 13:31:27.872670 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:31:41 crc kubenswrapper[4760]: I1204 13:31:41.867934 4760 scope.go:117] "RemoveContainer" containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:31:43 crc kubenswrapper[4760]: I1204 13:31:43.019822 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" 
event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"25f1a02c25c6b2bb06c6cde09b72be8f41e0f5c8d5c7eb1c96652ba7cd4679e6"} Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.004650 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-89f5v"] Dec 04 13:33:43 crc kubenswrapper[4760]: E1204 13:33:43.005676 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185046f1-711c-4fd1-983b-9057cfdb108c" containerName="extract-utilities" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.005695 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="185046f1-711c-4fd1-983b-9057cfdb108c" containerName="extract-utilities" Dec 04 13:33:43 crc kubenswrapper[4760]: E1204 13:33:43.005726 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185046f1-711c-4fd1-983b-9057cfdb108c" containerName="registry-server" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.005732 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="185046f1-711c-4fd1-983b-9057cfdb108c" containerName="registry-server" Dec 04 13:33:43 crc kubenswrapper[4760]: E1204 13:33:43.005747 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185046f1-711c-4fd1-983b-9057cfdb108c" containerName="extract-content" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.005753 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="185046f1-711c-4fd1-983b-9057cfdb108c" containerName="extract-content" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.005999 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="185046f1-711c-4fd1-983b-9057cfdb108c" containerName="registry-server" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.007787 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.025618 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89f5v"] Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.035071 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-catalog-content\") pod \"community-operators-89f5v\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.035249 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-utilities\") pod \"community-operators-89f5v\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.035412 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xfzh\" (UniqueName: \"kubernetes.io/projected/ccf94c72-3835-41eb-8566-aebe4a0384b1-kube-api-access-5xfzh\") pod \"community-operators-89f5v\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.137637 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-utilities\") pod \"community-operators-89f5v\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.137787 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5xfzh\" (UniqueName: \"kubernetes.io/projected/ccf94c72-3835-41eb-8566-aebe4a0384b1-kube-api-access-5xfzh\") pod \"community-operators-89f5v\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.138087 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-catalog-content\") pod \"community-operators-89f5v\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.138643 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-catalog-content\") pod \"community-operators-89f5v\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.138882 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-utilities\") pod \"community-operators-89f5v\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.475164 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xfzh\" (UniqueName: \"kubernetes.io/projected/ccf94c72-3835-41eb-8566-aebe4a0384b1-kube-api-access-5xfzh\") pod \"community-operators-89f5v\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:43 crc kubenswrapper[4760]: I1204 13:33:43.648911 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:44 crc kubenswrapper[4760]: I1204 13:33:44.129565 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89f5v"] Dec 04 13:33:44 crc kubenswrapper[4760]: I1204 13:33:44.757257 4760 generic.go:334] "Generic (PLEG): container finished" podID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerID="f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c" exitCode=0 Dec 04 13:33:44 crc kubenswrapper[4760]: I1204 13:33:44.757360 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89f5v" event={"ID":"ccf94c72-3835-41eb-8566-aebe4a0384b1","Type":"ContainerDied","Data":"f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c"} Dec 04 13:33:44 crc kubenswrapper[4760]: I1204 13:33:44.757612 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89f5v" event={"ID":"ccf94c72-3835-41eb-8566-aebe4a0384b1","Type":"ContainerStarted","Data":"ba10188f8da4431d8b86dbafc306cc1e6972a5ba0200f75b6283a72cee7de025"} Dec 04 13:33:45 crc kubenswrapper[4760]: I1204 13:33:45.767971 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89f5v" event={"ID":"ccf94c72-3835-41eb-8566-aebe4a0384b1","Type":"ContainerStarted","Data":"5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d"} Dec 04 13:33:46 crc kubenswrapper[4760]: I1204 13:33:46.778735 4760 generic.go:334] "Generic (PLEG): container finished" podID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerID="5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d" exitCode=0 Dec 04 13:33:46 crc kubenswrapper[4760]: I1204 13:33:46.778979 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89f5v" 
event={"ID":"ccf94c72-3835-41eb-8566-aebe4a0384b1","Type":"ContainerDied","Data":"5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d"} Dec 04 13:33:47 crc kubenswrapper[4760]: I1204 13:33:47.791346 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89f5v" event={"ID":"ccf94c72-3835-41eb-8566-aebe4a0384b1","Type":"ContainerStarted","Data":"ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748"} Dec 04 13:33:47 crc kubenswrapper[4760]: I1204 13:33:47.825275 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-89f5v" podStartSLOduration=3.3198164070000002 podStartE2EDuration="5.825252607s" podCreationTimestamp="2025-12-04 13:33:42 +0000 UTC" firstStartedPulling="2025-12-04 13:33:44.759015992 +0000 UTC m=+4827.800462559" lastFinishedPulling="2025-12-04 13:33:47.264452192 +0000 UTC m=+4830.305898759" observedRunningTime="2025-12-04 13:33:47.816903973 +0000 UTC m=+4830.858350550" watchObservedRunningTime="2025-12-04 13:33:47.825252607 +0000 UTC m=+4830.866699174" Dec 04 13:33:53 crc kubenswrapper[4760]: I1204 13:33:53.649158 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:53 crc kubenswrapper[4760]: I1204 13:33:53.649778 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:53 crc kubenswrapper[4760]: I1204 13:33:53.701781 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:53 crc kubenswrapper[4760]: I1204 13:33:53.901780 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:53 crc kubenswrapper[4760]: I1204 13:33:53.953740 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-89f5v"] Dec 04 13:33:55 crc kubenswrapper[4760]: I1204 13:33:55.865473 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-89f5v" podUID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerName="registry-server" containerID="cri-o://ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748" gracePeriod=2 Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.577165 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.716152 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xfzh\" (UniqueName: \"kubernetes.io/projected/ccf94c72-3835-41eb-8566-aebe4a0384b1-kube-api-access-5xfzh\") pod \"ccf94c72-3835-41eb-8566-aebe4a0384b1\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.716710 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-utilities\") pod \"ccf94c72-3835-41eb-8566-aebe4a0384b1\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.716832 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-catalog-content\") pod \"ccf94c72-3835-41eb-8566-aebe4a0384b1\" (UID: \"ccf94c72-3835-41eb-8566-aebe4a0384b1\") " Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.717652 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-utilities" (OuterVolumeSpecName: "utilities") pod "ccf94c72-3835-41eb-8566-aebe4a0384b1" (UID: 
"ccf94c72-3835-41eb-8566-aebe4a0384b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.724708 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf94c72-3835-41eb-8566-aebe4a0384b1-kube-api-access-5xfzh" (OuterVolumeSpecName: "kube-api-access-5xfzh") pod "ccf94c72-3835-41eb-8566-aebe4a0384b1" (UID: "ccf94c72-3835-41eb-8566-aebe4a0384b1"). InnerVolumeSpecName "kube-api-access-5xfzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.766478 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccf94c72-3835-41eb-8566-aebe4a0384b1" (UID: "ccf94c72-3835-41eb-8566-aebe4a0384b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.819517 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xfzh\" (UniqueName: \"kubernetes.io/projected/ccf94c72-3835-41eb-8566-aebe4a0384b1-kube-api-access-5xfzh\") on node \"crc\" DevicePath \"\"" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.819557 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.819567 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf94c72-3835-41eb-8566-aebe4a0384b1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.877948 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerID="ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748" exitCode=0 Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.878009 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89f5v" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.878010 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89f5v" event={"ID":"ccf94c72-3835-41eb-8566-aebe4a0384b1","Type":"ContainerDied","Data":"ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748"} Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.878130 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89f5v" event={"ID":"ccf94c72-3835-41eb-8566-aebe4a0384b1","Type":"ContainerDied","Data":"ba10188f8da4431d8b86dbafc306cc1e6972a5ba0200f75b6283a72cee7de025"} Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.878155 4760 scope.go:117] "RemoveContainer" containerID="ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.906004 4760 scope.go:117] "RemoveContainer" containerID="5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.929268 4760 scope.go:117] "RemoveContainer" containerID="f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c" Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.937488 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89f5v"] Dec 04 13:33:56 crc kubenswrapper[4760]: I1204 13:33:56.965597 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-89f5v"] Dec 04 13:33:57 crc kubenswrapper[4760]: I1204 13:33:57.012326 4760 scope.go:117] "RemoveContainer" 
containerID="ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748" Dec 04 13:33:57 crc kubenswrapper[4760]: E1204 13:33:57.013277 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748\": container with ID starting with ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748 not found: ID does not exist" containerID="ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748" Dec 04 13:33:57 crc kubenswrapper[4760]: I1204 13:33:57.013314 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748"} err="failed to get container status \"ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748\": rpc error: code = NotFound desc = could not find container \"ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748\": container with ID starting with ba330c2ac9a78f8736f89f8d7bb6a69eaf28e692ab7aef7e5c24933c2b316748 not found: ID does not exist" Dec 04 13:33:57 crc kubenswrapper[4760]: I1204 13:33:57.013343 4760 scope.go:117] "RemoveContainer" containerID="5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d" Dec 04 13:33:57 crc kubenswrapper[4760]: E1204 13:33:57.013856 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d\": container with ID starting with 5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d not found: ID does not exist" containerID="5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d" Dec 04 13:33:57 crc kubenswrapper[4760]: I1204 13:33:57.013956 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d"} err="failed to get container status \"5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d\": rpc error: code = NotFound desc = could not find container \"5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d\": container with ID starting with 5713a1a0f9ff622f19f2c1ff295cc22b39ee85a8e749ac7ae5cafd01e11ec09d not found: ID does not exist" Dec 04 13:33:57 crc kubenswrapper[4760]: I1204 13:33:57.013993 4760 scope.go:117] "RemoveContainer" containerID="f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c" Dec 04 13:33:57 crc kubenswrapper[4760]: E1204 13:33:57.014527 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c\": container with ID starting with f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c not found: ID does not exist" containerID="f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c" Dec 04 13:33:57 crc kubenswrapper[4760]: I1204 13:33:57.014579 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c"} err="failed to get container status \"f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c\": rpc error: code = NotFound desc = could not find container \"f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c\": container with ID starting with f0e5c021e7a535133347e4663b09e7286ef44fafa2616e713c0be8712e32531c not found: ID does not exist" Dec 04 13:33:57 crc kubenswrapper[4760]: I1204 13:33:57.878548 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccf94c72-3835-41eb-8566-aebe4a0384b1" path="/var/lib/kubelet/pods/ccf94c72-3835-41eb-8566-aebe4a0384b1/volumes" Dec 04 13:34:03 crc kubenswrapper[4760]: I1204 
13:34:03.380338 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:34:03 crc kubenswrapper[4760]: I1204 13:34:03.380943 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:34:33 crc kubenswrapper[4760]: I1204 13:34:33.380871 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:34:33 crc kubenswrapper[4760]: I1204 13:34:33.381363 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:35:03 crc kubenswrapper[4760]: I1204 13:35:03.381157 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:35:03 crc kubenswrapper[4760]: I1204 13:35:03.381828 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" 
podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:35:03 crc kubenswrapper[4760]: I1204 13:35:03.381899 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 13:35:03 crc kubenswrapper[4760]: I1204 13:35:03.382988 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25f1a02c25c6b2bb06c6cde09b72be8f41e0f5c8d5c7eb1c96652ba7cd4679e6"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 13:35:03 crc kubenswrapper[4760]: I1204 13:35:03.383064 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://25f1a02c25c6b2bb06c6cde09b72be8f41e0f5c8d5c7eb1c96652ba7cd4679e6" gracePeriod=600 Dec 04 13:35:03 crc kubenswrapper[4760]: I1204 13:35:03.687380 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="25f1a02c25c6b2bb06c6cde09b72be8f41e0f5c8d5c7eb1c96652ba7cd4679e6" exitCode=0 Dec 04 13:35:03 crc kubenswrapper[4760]: I1204 13:35:03.687491 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"25f1a02c25c6b2bb06c6cde09b72be8f41e0f5c8d5c7eb1c96652ba7cd4679e6"} Dec 04 13:35:03 crc kubenswrapper[4760]: I1204 13:35:03.687921 4760 scope.go:117] "RemoveContainer" 
containerID="cc413236d2e99fb4adae1171b70e8c6bfe515ade8b7f780e6f9d6f1007abb337" Dec 04 13:35:04 crc kubenswrapper[4760]: I1204 13:35:04.700727 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f"} Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.013445 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-64ttr"] Dec 04 13:36:04 crc kubenswrapper[4760]: E1204 13:36:04.014469 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerName="extract-utilities" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.014489 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerName="extract-utilities" Dec 04 13:36:04 crc kubenswrapper[4760]: E1204 13:36:04.014517 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerName="extract-content" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.014525 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerName="extract-content" Dec 04 13:36:04 crc kubenswrapper[4760]: E1204 13:36:04.014584 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerName="registry-server" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.014596 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerName="registry-server" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.014854 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccf94c72-3835-41eb-8566-aebe4a0384b1" containerName="registry-server" Dec 04 13:36:04 
crc kubenswrapper[4760]: I1204 13:36:04.020993 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.048279 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64ttr"] Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.199423 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvjn\" (UniqueName: \"kubernetes.io/projected/68c63277-b97f-4064-9195-f5a2206600de-kube-api-access-rxvjn\") pod \"redhat-operators-64ttr\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.199525 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-utilities\") pod \"redhat-operators-64ttr\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.199567 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-catalog-content\") pod \"redhat-operators-64ttr\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.301843 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvjn\" (UniqueName: \"kubernetes.io/projected/68c63277-b97f-4064-9195-f5a2206600de-kube-api-access-rxvjn\") pod \"redhat-operators-64ttr\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc 
kubenswrapper[4760]: I1204 13:36:04.301931 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-utilities\") pod \"redhat-operators-64ttr\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.301966 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-catalog-content\") pod \"redhat-operators-64ttr\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.302758 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-catalog-content\") pod \"redhat-operators-64ttr\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.302897 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-utilities\") pod \"redhat-operators-64ttr\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.330248 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxvjn\" (UniqueName: \"kubernetes.io/projected/68c63277-b97f-4064-9195-f5a2206600de-kube-api-access-rxvjn\") pod \"redhat-operators-64ttr\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.341706 4760 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:04 crc kubenswrapper[4760]: I1204 13:36:04.906710 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-64ttr"] Dec 04 13:36:05 crc kubenswrapper[4760]: I1204 13:36:05.323634 4760 generic.go:334] "Generic (PLEG): container finished" podID="68c63277-b97f-4064-9195-f5a2206600de" containerID="0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae" exitCode=0 Dec 04 13:36:05 crc kubenswrapper[4760]: I1204 13:36:05.324128 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64ttr" event={"ID":"68c63277-b97f-4064-9195-f5a2206600de","Type":"ContainerDied","Data":"0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae"} Dec 04 13:36:05 crc kubenswrapper[4760]: I1204 13:36:05.324796 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64ttr" event={"ID":"68c63277-b97f-4064-9195-f5a2206600de","Type":"ContainerStarted","Data":"13951cd73ec27dc1f8e8c7f69befa5c5cd85858f38caebb38d6436e8e3286a1a"} Dec 04 13:36:05 crc kubenswrapper[4760]: I1204 13:36:05.327750 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 13:36:06 crc kubenswrapper[4760]: I1204 13:36:06.336830 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64ttr" event={"ID":"68c63277-b97f-4064-9195-f5a2206600de","Type":"ContainerStarted","Data":"1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb"} Dec 04 13:36:08 crc kubenswrapper[4760]: I1204 13:36:08.355558 4760 generic.go:334] "Generic (PLEG): container finished" podID="68c63277-b97f-4064-9195-f5a2206600de" containerID="1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb" exitCode=0 Dec 04 13:36:08 crc kubenswrapper[4760]: I1204 13:36:08.355677 4760 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-64ttr" event={"ID":"68c63277-b97f-4064-9195-f5a2206600de","Type":"ContainerDied","Data":"1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb"} Dec 04 13:36:09 crc kubenswrapper[4760]: I1204 13:36:09.366323 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64ttr" event={"ID":"68c63277-b97f-4064-9195-f5a2206600de","Type":"ContainerStarted","Data":"57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c"} Dec 04 13:36:09 crc kubenswrapper[4760]: I1204 13:36:09.393616 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-64ttr" podStartSLOduration=2.971835494 podStartE2EDuration="6.393595185s" podCreationTimestamp="2025-12-04 13:36:03 +0000 UTC" firstStartedPulling="2025-12-04 13:36:05.327531576 +0000 UTC m=+4968.368978143" lastFinishedPulling="2025-12-04 13:36:08.749291267 +0000 UTC m=+4971.790737834" observedRunningTime="2025-12-04 13:36:09.384395863 +0000 UTC m=+4972.425842430" watchObservedRunningTime="2025-12-04 13:36:09.393595185 +0000 UTC m=+4972.435041752" Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.392047 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkww"] Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.394644 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.405179 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkww"] Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.521789 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-utilities\") pod \"redhat-marketplace-xjkww\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.521879 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzktf\" (UniqueName: \"kubernetes.io/projected/276b7bb3-54fb-49c6-bdea-b938d23099f8-kube-api-access-zzktf\") pod \"redhat-marketplace-xjkww\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.521974 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-catalog-content\") pod \"redhat-marketplace-xjkww\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.623972 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-utilities\") pod \"redhat-marketplace-xjkww\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.624660 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zzktf\" (UniqueName: \"kubernetes.io/projected/276b7bb3-54fb-49c6-bdea-b938d23099f8-kube-api-access-zzktf\") pod \"redhat-marketplace-xjkww\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.624800 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-catalog-content\") pod \"redhat-marketplace-xjkww\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.624491 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-utilities\") pod \"redhat-marketplace-xjkww\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:11 crc kubenswrapper[4760]: I1204 13:36:11.625536 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-catalog-content\") pod \"redhat-marketplace-xjkww\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:12 crc kubenswrapper[4760]: I1204 13:36:12.068005 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzktf\" (UniqueName: \"kubernetes.io/projected/276b7bb3-54fb-49c6-bdea-b938d23099f8-kube-api-access-zzktf\") pod \"redhat-marketplace-xjkww\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:12 crc kubenswrapper[4760]: I1204 13:36:12.331527 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:12 crc kubenswrapper[4760]: I1204 13:36:12.885684 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkww"] Dec 04 13:36:13 crc kubenswrapper[4760]: I1204 13:36:13.401183 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkww" event={"ID":"276b7bb3-54fb-49c6-bdea-b938d23099f8","Type":"ContainerStarted","Data":"304da54c684756549ac37ba3214d72141eff55b13449af2061e1141c967e6efc"} Dec 04 13:36:14 crc kubenswrapper[4760]: I1204 13:36:14.341880 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:14 crc kubenswrapper[4760]: I1204 13:36:14.342222 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:14 crc kubenswrapper[4760]: I1204 13:36:14.390696 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:14 crc kubenswrapper[4760]: I1204 13:36:14.412070 4760 generic.go:334] "Generic (PLEG): container finished" podID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerID="0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98" exitCode=0 Dec 04 13:36:14 crc kubenswrapper[4760]: I1204 13:36:14.412223 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkww" event={"ID":"276b7bb3-54fb-49c6-bdea-b938d23099f8","Type":"ContainerDied","Data":"0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98"} Dec 04 13:36:14 crc kubenswrapper[4760]: I1204 13:36:14.468644 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:15 crc kubenswrapper[4760]: I1204 13:36:15.422188 4760 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkww" event={"ID":"276b7bb3-54fb-49c6-bdea-b938d23099f8","Type":"ContainerStarted","Data":"ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65"} Dec 04 13:36:16 crc kubenswrapper[4760]: I1204 13:36:16.434824 4760 generic.go:334] "Generic (PLEG): container finished" podID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerID="ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65" exitCode=0 Dec 04 13:36:16 crc kubenswrapper[4760]: I1204 13:36:16.434950 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkww" event={"ID":"276b7bb3-54fb-49c6-bdea-b938d23099f8","Type":"ContainerDied","Data":"ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65"} Dec 04 13:36:16 crc kubenswrapper[4760]: I1204 13:36:16.580066 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-64ttr"] Dec 04 13:36:16 crc kubenswrapper[4760]: I1204 13:36:16.580460 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-64ttr" podUID="68c63277-b97f-4064-9195-f5a2206600de" containerName="registry-server" containerID="cri-o://57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c" gracePeriod=2 Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.248851 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.368013 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxvjn\" (UniqueName: \"kubernetes.io/projected/68c63277-b97f-4064-9195-f5a2206600de-kube-api-access-rxvjn\") pod \"68c63277-b97f-4064-9195-f5a2206600de\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.369019 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-utilities\") pod \"68c63277-b97f-4064-9195-f5a2206600de\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.369271 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-catalog-content\") pod \"68c63277-b97f-4064-9195-f5a2206600de\" (UID: \"68c63277-b97f-4064-9195-f5a2206600de\") " Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.370255 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-utilities" (OuterVolumeSpecName: "utilities") pod "68c63277-b97f-4064-9195-f5a2206600de" (UID: "68c63277-b97f-4064-9195-f5a2206600de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.380049 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c63277-b97f-4064-9195-f5a2206600de-kube-api-access-rxvjn" (OuterVolumeSpecName: "kube-api-access-rxvjn") pod "68c63277-b97f-4064-9195-f5a2206600de" (UID: "68c63277-b97f-4064-9195-f5a2206600de"). InnerVolumeSpecName "kube-api-access-rxvjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.691926 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxvjn\" (UniqueName: \"kubernetes.io/projected/68c63277-b97f-4064-9195-f5a2206600de-kube-api-access-rxvjn\") on node \"crc\" DevicePath \"\"" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.691954 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.702347 4760 generic.go:334] "Generic (PLEG): container finished" podID="68c63277-b97f-4064-9195-f5a2206600de" containerID="57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c" exitCode=0 Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.702405 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64ttr" event={"ID":"68c63277-b97f-4064-9195-f5a2206600de","Type":"ContainerDied","Data":"57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c"} Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.702435 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-64ttr" event={"ID":"68c63277-b97f-4064-9195-f5a2206600de","Type":"ContainerDied","Data":"13951cd73ec27dc1f8e8c7f69befa5c5cd85858f38caebb38d6436e8e3286a1a"} Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.702453 4760 scope.go:117] "RemoveContainer" containerID="57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.702548 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-64ttr" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.727937 4760 scope.go:117] "RemoveContainer" containerID="1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.730665 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68c63277-b97f-4064-9195-f5a2206600de" (UID: "68c63277-b97f-4064-9195-f5a2206600de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.772283 4760 scope.go:117] "RemoveContainer" containerID="0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.792917 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c63277-b97f-4064-9195-f5a2206600de-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.821901 4760 scope.go:117] "RemoveContainer" containerID="57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c" Dec 04 13:36:17 crc kubenswrapper[4760]: E1204 13:36:17.822580 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c\": container with ID starting with 57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c not found: ID does not exist" containerID="57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.822625 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c"} err="failed to get container status \"57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c\": rpc error: code = NotFound desc = could not find container \"57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c\": container with ID starting with 57b539dc3cdd3c542d1953b999f8b7bd9a829aee0ed1b74b4336886309db469c not found: ID does not exist" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.822653 4760 scope.go:117] "RemoveContainer" containerID="1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb" Dec 04 13:36:17 crc kubenswrapper[4760]: E1204 13:36:17.822974 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb\": container with ID starting with 1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb not found: ID does not exist" containerID="1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.823010 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb"} err="failed to get container status \"1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb\": rpc error: code = NotFound desc = could not find container \"1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb\": container with ID starting with 1612b0df9e37181ff16238e1001a260506df3b767728e433442ea239803554bb not found: ID does not exist" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.823037 4760 scope.go:117] "RemoveContainer" containerID="0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae" Dec 04 13:36:17 crc kubenswrapper[4760]: E1204 13:36:17.823409 4760 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae\": container with ID starting with 0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae not found: ID does not exist" containerID="0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae" Dec 04 13:36:17 crc kubenswrapper[4760]: I1204 13:36:17.823435 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae"} err="failed to get container status \"0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae\": rpc error: code = NotFound desc = could not find container \"0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae\": container with ID starting with 0615b638c677524fce4f0de3cdbc906b051f9af89c3a440c8d3dd37494dab7ae not found: ID does not exist" Dec 04 13:36:18 crc kubenswrapper[4760]: I1204 13:36:18.026641 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-64ttr"] Dec 04 13:36:18 crc kubenswrapper[4760]: I1204 13:36:18.036012 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-64ttr"] Dec 04 13:36:19 crc kubenswrapper[4760]: I1204 13:36:19.875228 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c63277-b97f-4064-9195-f5a2206600de" path="/var/lib/kubelet/pods/68c63277-b97f-4064-9195-f5a2206600de/volumes" Dec 04 13:36:26 crc kubenswrapper[4760]: E1204 13:36:26.743029 4760 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad" Dec 04 13:36:26 crc kubenswrapper[4760]: E1204 13:36:26.743770 4760 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad,Command:[/bin/opm],Args:[serve /extracted-catalog/catalog --cache-dir=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOMEMLIMIT,Value:20MiB,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zzktf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xjkww_openshift-marketplace(276b7bb3-54fb-49c6-bdea-b938d23099f8): ErrImagePull: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError" Dec 04 13:36:26 crc kubenswrapper[4760]: E1204 13:36:26.744987 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out\"" pod="openshift-marketplace/redhat-marketplace-xjkww" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" Dec 04 13:36:26 crc kubenswrapper[4760]: E1204 13:36:26.857049 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"\"" pod="openshift-marketplace/redhat-marketplace-xjkww" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" Dec 04 13:36:42 crc kubenswrapper[4760]: I1204 13:36:42.008485 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkww" event={"ID":"276b7bb3-54fb-49c6-bdea-b938d23099f8","Type":"ContainerStarted","Data":"f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6"} Dec 04 13:36:42 crc kubenswrapper[4760]: I1204 13:36:42.035533 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xjkww" podStartSLOduration=3.6985430790000002 podStartE2EDuration="31.035511923s" podCreationTimestamp="2025-12-04 13:36:11 +0000 UTC" firstStartedPulling="2025-12-04 13:36:14.419589214 +0000 UTC m=+4977.461035781" lastFinishedPulling="2025-12-04 13:36:41.756558058 +0000 UTC m=+5004.798004625" observedRunningTime="2025-12-04 13:36:42.030295187 +0000 UTC m=+5005.071741764" watchObservedRunningTime="2025-12-04 13:36:42.035511923 +0000 UTC m=+5005.076958480" Dec 04 13:36:42 crc kubenswrapper[4760]: I1204 13:36:42.332236 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:42 crc kubenswrapper[4760]: I1204 13:36:42.332298 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:43 crc kubenswrapper[4760]: I1204 13:36:43.448293 4760 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-xjkww" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerName="registry-server" probeResult="failure" output=< Dec 04 13:36:43 crc kubenswrapper[4760]: timeout: failed to connect service ":50051" within 1s Dec 04 13:36:43 crc kubenswrapper[4760]: > Dec 04 
13:36:52 crc kubenswrapper[4760]: I1204 13:36:52.386242 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:52 crc kubenswrapper[4760]: I1204 13:36:52.443945 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:52 crc kubenswrapper[4760]: I1204 13:36:52.638751 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkww"] Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.118800 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xjkww" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerName="registry-server" containerID="cri-o://f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6" gracePeriod=2 Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.727559 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.853626 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-catalog-content\") pod \"276b7bb3-54fb-49c6-bdea-b938d23099f8\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.855264 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-utilities\") pod \"276b7bb3-54fb-49c6-bdea-b938d23099f8\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.855408 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzktf\" (UniqueName: \"kubernetes.io/projected/276b7bb3-54fb-49c6-bdea-b938d23099f8-kube-api-access-zzktf\") pod \"276b7bb3-54fb-49c6-bdea-b938d23099f8\" (UID: \"276b7bb3-54fb-49c6-bdea-b938d23099f8\") " Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.855979 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-utilities" (OuterVolumeSpecName: "utilities") pod "276b7bb3-54fb-49c6-bdea-b938d23099f8" (UID: "276b7bb3-54fb-49c6-bdea-b938d23099f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.862177 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/276b7bb3-54fb-49c6-bdea-b938d23099f8-kube-api-access-zzktf" (OuterVolumeSpecName: "kube-api-access-zzktf") pod "276b7bb3-54fb-49c6-bdea-b938d23099f8" (UID: "276b7bb3-54fb-49c6-bdea-b938d23099f8"). InnerVolumeSpecName "kube-api-access-zzktf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.879034 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "276b7bb3-54fb-49c6-bdea-b938d23099f8" (UID: "276b7bb3-54fb-49c6-bdea-b938d23099f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.958283 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.958321 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276b7bb3-54fb-49c6-bdea-b938d23099f8-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:36:54 crc kubenswrapper[4760]: I1204 13:36:54.958331 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzktf\" (UniqueName: \"kubernetes.io/projected/276b7bb3-54fb-49c6-bdea-b938d23099f8-kube-api-access-zzktf\") on node \"crc\" DevicePath \"\"" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.134533 4760 generic.go:334] "Generic (PLEG): container finished" podID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerID="f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6" exitCode=0 Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.134867 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjkww" event={"ID":"276b7bb3-54fb-49c6-bdea-b938d23099f8","Type":"ContainerDied","Data":"f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6"} Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.134904 4760 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-xjkww" event={"ID":"276b7bb3-54fb-49c6-bdea-b938d23099f8","Type":"ContainerDied","Data":"304da54c684756549ac37ba3214d72141eff55b13449af2061e1141c967e6efc"} Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.134925 4760 scope.go:117] "RemoveContainer" containerID="f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.135112 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjkww" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.172683 4760 scope.go:117] "RemoveContainer" containerID="ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.185259 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkww"] Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.192148 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjkww"] Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.196481 4760 scope.go:117] "RemoveContainer" containerID="0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.247529 4760 scope.go:117] "RemoveContainer" containerID="f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6" Dec 04 13:36:55 crc kubenswrapper[4760]: E1204 13:36:55.247994 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6\": container with ID starting with f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6 not found: ID does not exist" containerID="f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.248047 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6"} err="failed to get container status \"f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6\": rpc error: code = NotFound desc = could not find container \"f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6\": container with ID starting with f36863ad254ff8a2b8be6242fc5ee995d322dbffe01cab434d503242ccbe6aa6 not found: ID does not exist" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.248078 4760 scope.go:117] "RemoveContainer" containerID="ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65" Dec 04 13:36:55 crc kubenswrapper[4760]: E1204 13:36:55.248380 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65\": container with ID starting with ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65 not found: ID does not exist" containerID="ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.248412 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65"} err="failed to get container status \"ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65\": rpc error: code = NotFound desc = could not find container \"ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65\": container with ID starting with ed78cd4b11e92bf6fff7f238cd6f1e500e2f507120a81993e83a541e750b3d65 not found: ID does not exist" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.248433 4760 scope.go:117] "RemoveContainer" containerID="0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98" Dec 04 13:36:55 crc kubenswrapper[4760]: E1204 
13:36:55.248777 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98\": container with ID starting with 0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98 not found: ID does not exist" containerID="0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.248801 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98"} err="failed to get container status \"0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98\": rpc error: code = NotFound desc = could not find container \"0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98\": container with ID starting with 0a34e615394f6cbef53dced66f3cba363c4d53d509061444f9164975bd1c5c98 not found: ID does not exist" Dec 04 13:36:55 crc kubenswrapper[4760]: I1204 13:36:55.875087 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" path="/var/lib/kubelet/pods/276b7bb3-54fb-49c6-bdea-b938d23099f8/volumes" Dec 04 13:37:03 crc kubenswrapper[4760]: I1204 13:37:03.380607 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:37:03 crc kubenswrapper[4760]: I1204 13:37:03.381075 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 04 13:37:33 crc kubenswrapper[4760]: I1204 13:37:33.379897 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:37:33 crc kubenswrapper[4760]: I1204 13:37:33.380424 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:38:03 crc kubenswrapper[4760]: I1204 13:38:03.380880 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:38:03 crc kubenswrapper[4760]: I1204 13:38:03.381452 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:38:03 crc kubenswrapper[4760]: I1204 13:38:03.381516 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 13:38:03 crc kubenswrapper[4760]: I1204 13:38:03.382633 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f"} 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 13:38:03 crc kubenswrapper[4760]: I1204 13:38:03.382705 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" gracePeriod=600 Dec 04 13:38:03 crc kubenswrapper[4760]: E1204 13:38:03.503183 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:38:03 crc kubenswrapper[4760]: I1204 13:38:03.798013 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" exitCode=0 Dec 04 13:38:03 crc kubenswrapper[4760]: I1204 13:38:03.798058 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f"} Dec 04 13:38:03 crc kubenswrapper[4760]: I1204 13:38:03.798114 4760 scope.go:117] "RemoveContainer" containerID="25f1a02c25c6b2bb06c6cde09b72be8f41e0f5c8d5c7eb1c96652ba7cd4679e6" Dec 04 13:38:03 crc kubenswrapper[4760]: I1204 13:38:03.799422 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 
04 13:38:03 crc kubenswrapper[4760]: E1204 13:38:03.799871 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:38:16 crc kubenswrapper[4760]: I1204 13:38:16.864466 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:38:16 crc kubenswrapper[4760]: E1204 13:38:16.865243 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:38:31 crc kubenswrapper[4760]: I1204 13:38:31.864443 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:38:31 crc kubenswrapper[4760]: E1204 13:38:31.865302 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:38:45 crc kubenswrapper[4760]: I1204 13:38:45.863918 4760 scope.go:117] "RemoveContainer" 
containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:38:45 crc kubenswrapper[4760]: E1204 13:38:45.864772 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:38:56 crc kubenswrapper[4760]: I1204 13:38:56.865093 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:38:56 crc kubenswrapper[4760]: E1204 13:38:56.866076 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:39:08 crc kubenswrapper[4760]: I1204 13:39:08.864153 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:39:08 crc kubenswrapper[4760]: E1204 13:39:08.864978 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:39:22 crc kubenswrapper[4760]: I1204 13:39:22.864773 4760 scope.go:117] 
"RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:39:22 crc kubenswrapper[4760]: E1204 13:39:22.865587 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:39:33 crc kubenswrapper[4760]: I1204 13:39:33.865260 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:39:33 crc kubenswrapper[4760]: E1204 13:39:33.866137 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:39:48 crc kubenswrapper[4760]: I1204 13:39:48.865160 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:39:48 crc kubenswrapper[4760]: E1204 13:39:48.865986 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:40:03 crc kubenswrapper[4760]: I1204 13:40:03.864868 
4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:40:03 crc kubenswrapper[4760]: E1204 13:40:03.865637 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:40:15 crc kubenswrapper[4760]: I1204 13:40:15.865265 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:40:15 crc kubenswrapper[4760]: E1204 13:40:15.866548 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:40:30 crc kubenswrapper[4760]: I1204 13:40:30.305669 4760 generic.go:334] "Generic (PLEG): container finished" podID="ef91667b-5e29-49a0-9de9-d557462e96c0" containerID="5e7754fcd7bf18a0518f187a1d01439425bfede59fe05a113042b1164f10ca12" exitCode=0 Dec 04 13:40:30 crc kubenswrapper[4760]: I1204 13:40:30.305754 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ef91667b-5e29-49a0-9de9-d557462e96c0","Type":"ContainerDied","Data":"5e7754fcd7bf18a0518f187a1d01439425bfede59fe05a113042b1164f10ca12"} Dec 04 13:40:30 crc kubenswrapper[4760]: I1204 13:40:30.864250 4760 scope.go:117] "RemoveContainer" 
containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:40:30 crc kubenswrapper[4760]: E1204 13:40:30.864525 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.077334 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.236037 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ca-certs\") pod \"ef91667b-5e29-49a0-9de9-d557462e96c0\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.236092 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config\") pod \"ef91667b-5e29-49a0-9de9-d557462e96c0\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.236152 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config-secret\") pod \"ef91667b-5e29-49a0-9de9-d557462e96c0\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.236188 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ef91667b-5e29-49a0-9de9-d557462e96c0\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.236237 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-workdir\") pod \"ef91667b-5e29-49a0-9de9-d557462e96c0\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.236283 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ssh-key\") pod \"ef91667b-5e29-49a0-9de9-d557462e96c0\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.236333 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-config-data\") pod \"ef91667b-5e29-49a0-9de9-d557462e96c0\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.236365 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-temporary\") pod \"ef91667b-5e29-49a0-9de9-d557462e96c0\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.236405 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5xqj\" (UniqueName: \"kubernetes.io/projected/ef91667b-5e29-49a0-9de9-d557462e96c0-kube-api-access-r5xqj\") pod \"ef91667b-5e29-49a0-9de9-d557462e96c0\" (UID: \"ef91667b-5e29-49a0-9de9-d557462e96c0\") " Dec 04 
13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.238202 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-config-data" (OuterVolumeSpecName: "config-data") pod "ef91667b-5e29-49a0-9de9-d557462e96c0" (UID: "ef91667b-5e29-49a0-9de9-d557462e96c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.238975 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ef91667b-5e29-49a0-9de9-d557462e96c0" (UID: "ef91667b-5e29-49a0-9de9-d557462e96c0"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.243179 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ef91667b-5e29-49a0-9de9-d557462e96c0" (UID: "ef91667b-5e29-49a0-9de9-d557462e96c0"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.259745 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ef91667b-5e29-49a0-9de9-d557462e96c0" (UID: "ef91667b-5e29-49a0-9de9-d557462e96c0"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.262769 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef91667b-5e29-49a0-9de9-d557462e96c0-kube-api-access-r5xqj" (OuterVolumeSpecName: "kube-api-access-r5xqj") pod "ef91667b-5e29-49a0-9de9-d557462e96c0" (UID: "ef91667b-5e29-49a0-9de9-d557462e96c0"). InnerVolumeSpecName "kube-api-access-r5xqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.269146 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ef91667b-5e29-49a0-9de9-d557462e96c0" (UID: "ef91667b-5e29-49a0-9de9-d557462e96c0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.270175 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ef91667b-5e29-49a0-9de9-d557462e96c0" (UID: "ef91667b-5e29-49a0-9de9-d557462e96c0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.300382 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ef91667b-5e29-49a0-9de9-d557462e96c0" (UID: "ef91667b-5e29-49a0-9de9-d557462e96c0"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.307693 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ef91667b-5e29-49a0-9de9-d557462e96c0" (UID: "ef91667b-5e29-49a0-9de9-d557462e96c0"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.331654 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ef91667b-5e29-49a0-9de9-d557462e96c0","Type":"ContainerDied","Data":"516cafa196dc40be073ba31a51ada137cfbda3b4ffc5cff8f652150ad50777f1"} Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.331706 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="516cafa196dc40be073ba31a51ada137cfbda3b4ffc5cff8f652150ad50777f1" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.331787 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.340074 4760 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.340125 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.340139 4760 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.340173 4760 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.340186 4760 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.340197 4760 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef91667b-5e29-49a0-9de9-d557462e96c0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.340207 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef91667b-5e29-49a0-9de9-d557462e96c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 
13:40:32.340235 4760 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ef91667b-5e29-49a0-9de9-d557462e96c0-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.340245 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5xqj\" (UniqueName: \"kubernetes.io/projected/ef91667b-5e29-49a0-9de9-d557462e96c0-kube-api-access-r5xqj\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.366489 4760 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 04 13:40:32 crc kubenswrapper[4760]: I1204 13:40:32.442556 4760 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.129032 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bhz8v"] Dec 04 13:40:33 crc kubenswrapper[4760]: E1204 13:40:33.129784 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerName="extract-utilities" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.129799 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerName="extract-utilities" Dec 04 13:40:33 crc kubenswrapper[4760]: E1204 13:40:33.129819 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef91667b-5e29-49a0-9de9-d557462e96c0" containerName="tempest-tests-tempest-tests-runner" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.129825 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef91667b-5e29-49a0-9de9-d557462e96c0" 
containerName="tempest-tests-tempest-tests-runner" Dec 04 13:40:33 crc kubenswrapper[4760]: E1204 13:40:33.129841 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c63277-b97f-4064-9195-f5a2206600de" containerName="registry-server" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.129847 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c63277-b97f-4064-9195-f5a2206600de" containerName="registry-server" Dec 04 13:40:33 crc kubenswrapper[4760]: E1204 13:40:33.129883 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerName="registry-server" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.129891 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerName="registry-server" Dec 04 13:40:33 crc kubenswrapper[4760]: E1204 13:40:33.129904 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerName="extract-content" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.129910 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerName="extract-content" Dec 04 13:40:33 crc kubenswrapper[4760]: E1204 13:40:33.129927 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c63277-b97f-4064-9195-f5a2206600de" containerName="extract-utilities" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.129932 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c63277-b97f-4064-9195-f5a2206600de" containerName="extract-utilities" Dec 04 13:40:33 crc kubenswrapper[4760]: E1204 13:40:33.129944 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c63277-b97f-4064-9195-f5a2206600de" containerName="extract-content" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.129949 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="68c63277-b97f-4064-9195-f5a2206600de" containerName="extract-content" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.130128 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c63277-b97f-4064-9195-f5a2206600de" containerName="registry-server" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.130153 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef91667b-5e29-49a0-9de9-d557462e96c0" containerName="tempest-tests-tempest-tests-runner" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.130163 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="276b7bb3-54fb-49c6-bdea-b938d23099f8" containerName="registry-server" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.131655 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.160845 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhz8v"] Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.262069 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxntr\" (UniqueName: \"kubernetes.io/projected/f9238366-e87a-44d2-97d5-c1ce038304f5-kube-api-access-zxntr\") pod \"certified-operators-bhz8v\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.262188 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-utilities\") pod \"certified-operators-bhz8v\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.262245 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-catalog-content\") pod \"certified-operators-bhz8v\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.364293 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxntr\" (UniqueName: \"kubernetes.io/projected/f9238366-e87a-44d2-97d5-c1ce038304f5-kube-api-access-zxntr\") pod \"certified-operators-bhz8v\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.364411 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-utilities\") pod \"certified-operators-bhz8v\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.364435 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-catalog-content\") pod \"certified-operators-bhz8v\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.364979 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-utilities\") pod \"certified-operators-bhz8v\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.365183 4760 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-catalog-content\") pod \"certified-operators-bhz8v\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.400435 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxntr\" (UniqueName: \"kubernetes.io/projected/f9238366-e87a-44d2-97d5-c1ce038304f5-kube-api-access-zxntr\") pod \"certified-operators-bhz8v\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:33 crc kubenswrapper[4760]: I1204 13:40:33.454326 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:34 crc kubenswrapper[4760]: I1204 13:40:34.519827 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bhz8v"] Dec 04 13:40:35 crc kubenswrapper[4760]: I1204 13:40:35.363927 4760 generic.go:334] "Generic (PLEG): container finished" podID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerID="af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01" exitCode=0 Dec 04 13:40:35 crc kubenswrapper[4760]: I1204 13:40:35.364049 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhz8v" event={"ID":"f9238366-e87a-44d2-97d5-c1ce038304f5","Type":"ContainerDied","Data":"af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01"} Dec 04 13:40:35 crc kubenswrapper[4760]: I1204 13:40:35.364345 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhz8v" event={"ID":"f9238366-e87a-44d2-97d5-c1ce038304f5","Type":"ContainerStarted","Data":"3f9e7db879c63b8722899835ed6e9501ee3a67ad7841e45ad8cb7bed42574052"} Dec 04 13:40:36 crc 
kubenswrapper[4760]: I1204 13:40:36.387710 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhz8v" event={"ID":"f9238366-e87a-44d2-97d5-c1ce038304f5","Type":"ContainerStarted","Data":"53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd"} Dec 04 13:40:37 crc kubenswrapper[4760]: I1204 13:40:37.403833 4760 generic.go:334] "Generic (PLEG): container finished" podID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerID="53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd" exitCode=0 Dec 04 13:40:37 crc kubenswrapper[4760]: I1204 13:40:37.403947 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhz8v" event={"ID":"f9238366-e87a-44d2-97d5-c1ce038304f5","Type":"ContainerDied","Data":"53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd"} Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.487859 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.515046 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.520831 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.666321 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djfb4\" (UniqueName: \"kubernetes.io/projected/2526d1e6-dd17-4093-92b2-1bee2a207bac-kube-api-access-djfb4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2526d1e6-dd17-4093-92b2-1bee2a207bac\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.666482 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2526d1e6-dd17-4093-92b2-1bee2a207bac\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.769097 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2526d1e6-dd17-4093-92b2-1bee2a207bac\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.769395 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djfb4\" (UniqueName: \"kubernetes.io/projected/2526d1e6-dd17-4093-92b2-1bee2a207bac-kube-api-access-djfb4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2526d1e6-dd17-4093-92b2-1bee2a207bac\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.769637 4760 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2526d1e6-dd17-4093-92b2-1bee2a207bac\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.802908 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djfb4\" (UniqueName: \"kubernetes.io/projected/2526d1e6-dd17-4093-92b2-1bee2a207bac-kube-api-access-djfb4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2526d1e6-dd17-4093-92b2-1bee2a207bac\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.807531 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2526d1e6-dd17-4093-92b2-1bee2a207bac\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 13:40:40 crc kubenswrapper[4760]: I1204 13:40:40.845199 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 13:40:41 crc kubenswrapper[4760]: I1204 13:40:41.324043 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 13:40:41 crc kubenswrapper[4760]: I1204 13:40:41.443761 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2526d1e6-dd17-4093-92b2-1bee2a207bac","Type":"ContainerStarted","Data":"ca0e03460ade9cc9b767d19aa00f405dcb29fa22a041594493b5156e2d73a4f2"} Dec 04 13:40:41 crc kubenswrapper[4760]: I1204 13:40:41.865242 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:40:41 crc kubenswrapper[4760]: E1204 13:40:41.866578 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:40:43 crc kubenswrapper[4760]: I1204 13:40:43.501519 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bhz8v" podStartSLOduration=3.559116139 podStartE2EDuration="10.501486784s" podCreationTimestamp="2025-12-04 13:40:33 +0000 UTC" firstStartedPulling="2025-12-04 13:40:35.365989546 +0000 UTC m=+5238.407436113" lastFinishedPulling="2025-12-04 13:40:42.308360171 +0000 UTC m=+5245.349806758" observedRunningTime="2025-12-04 13:40:43.494815373 +0000 UTC m=+5246.536261950" watchObservedRunningTime="2025-12-04 13:40:43.501486784 +0000 UTC m=+5246.542933381" Dec 04 13:40:44 crc kubenswrapper[4760]: I1204 13:40:44.484921 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhz8v" event={"ID":"f9238366-e87a-44d2-97d5-c1ce038304f5","Type":"ContainerStarted","Data":"02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd"} Dec 04 13:40:44 crc kubenswrapper[4760]: I1204 13:40:44.486501 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2526d1e6-dd17-4093-92b2-1bee2a207bac","Type":"ContainerStarted","Data":"fc7bfc8426b78cc015ee30a8626126836e0decbe7803d31bd0da200554bf670e"} Dec 04 13:40:44 crc kubenswrapper[4760]: I1204 13:40:44.505756 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.507309479 podStartE2EDuration="4.505734457s" podCreationTimestamp="2025-12-04 13:40:40 +0000 UTC" firstStartedPulling="2025-12-04 13:40:41.33887596 +0000 UTC m=+5244.380322537" lastFinishedPulling="2025-12-04 13:40:43.337300948 +0000 UTC m=+5246.378747515" observedRunningTime="2025-12-04 13:40:44.499334475 +0000 UTC m=+5247.540781042" watchObservedRunningTime="2025-12-04 13:40:44.505734457 +0000 UTC m=+5247.547181024" Dec 04 13:40:52 crc kubenswrapper[4760]: I1204 13:40:52.864738 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:40:52 crc kubenswrapper[4760]: E1204 13:40:52.865626 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:40:53 crc kubenswrapper[4760]: I1204 13:40:53.454537 4760 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:53 crc kubenswrapper[4760]: I1204 13:40:53.454889 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:53 crc kubenswrapper[4760]: I1204 13:40:53.504718 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:53 crc kubenswrapper[4760]: I1204 13:40:53.679160 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:53 crc kubenswrapper[4760]: I1204 13:40:53.743361 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhz8v"] Dec 04 13:40:55 crc kubenswrapper[4760]: I1204 13:40:55.654721 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bhz8v" podUID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerName="registry-server" containerID="cri-o://02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd" gracePeriod=2 Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.144748 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.276668 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-catalog-content\") pod \"f9238366-e87a-44d2-97d5-c1ce038304f5\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.276790 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-utilities\") pod \"f9238366-e87a-44d2-97d5-c1ce038304f5\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.277353 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxntr\" (UniqueName: \"kubernetes.io/projected/f9238366-e87a-44d2-97d5-c1ce038304f5-kube-api-access-zxntr\") pod \"f9238366-e87a-44d2-97d5-c1ce038304f5\" (UID: \"f9238366-e87a-44d2-97d5-c1ce038304f5\") " Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.277779 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-utilities" (OuterVolumeSpecName: "utilities") pod "f9238366-e87a-44d2-97d5-c1ce038304f5" (UID: "f9238366-e87a-44d2-97d5-c1ce038304f5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.278086 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.284577 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9238366-e87a-44d2-97d5-c1ce038304f5-kube-api-access-zxntr" (OuterVolumeSpecName: "kube-api-access-zxntr") pod "f9238366-e87a-44d2-97d5-c1ce038304f5" (UID: "f9238366-e87a-44d2-97d5-c1ce038304f5"). InnerVolumeSpecName "kube-api-access-zxntr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.339416 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9238366-e87a-44d2-97d5-c1ce038304f5" (UID: "f9238366-e87a-44d2-97d5-c1ce038304f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.380095 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9238366-e87a-44d2-97d5-c1ce038304f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.380145 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxntr\" (UniqueName: \"kubernetes.io/projected/f9238366-e87a-44d2-97d5-c1ce038304f5-kube-api-access-zxntr\") on node \"crc\" DevicePath \"\"" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.664668 4760 generic.go:334] "Generic (PLEG): container finished" podID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerID="02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd" exitCode=0 Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.664720 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bhz8v" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.664749 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhz8v" event={"ID":"f9238366-e87a-44d2-97d5-c1ce038304f5","Type":"ContainerDied","Data":"02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd"} Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.666152 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bhz8v" event={"ID":"f9238366-e87a-44d2-97d5-c1ce038304f5","Type":"ContainerDied","Data":"3f9e7db879c63b8722899835ed6e9501ee3a67ad7841e45ad8cb7bed42574052"} Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.666178 4760 scope.go:117] "RemoveContainer" containerID="02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.688550 4760 scope.go:117] "RemoveContainer" 
containerID="53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.704548 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bhz8v"] Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.714845 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bhz8v"] Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.736470 4760 scope.go:117] "RemoveContainer" containerID="af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.761892 4760 scope.go:117] "RemoveContainer" containerID="02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd" Dec 04 13:40:56 crc kubenswrapper[4760]: E1204 13:40:56.762578 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd\": container with ID starting with 02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd not found: ID does not exist" containerID="02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.762615 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd"} err="failed to get container status \"02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd\": rpc error: code = NotFound desc = could not find container \"02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd\": container with ID starting with 02e3cf48c39eb9b6f7d7f8d7b5d269483e9925a3c29425cbf010d2070f0584cd not found: ID does not exist" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.762643 4760 scope.go:117] "RemoveContainer" 
containerID="53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd" Dec 04 13:40:56 crc kubenswrapper[4760]: E1204 13:40:56.762895 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd\": container with ID starting with 53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd not found: ID does not exist" containerID="53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.762927 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd"} err="failed to get container status \"53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd\": rpc error: code = NotFound desc = could not find container \"53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd\": container with ID starting with 53ec0858139c77621e6489c99e716bb63b5f0b4513e22cbeebaccd9f1d16f5bd not found: ID does not exist" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.762952 4760 scope.go:117] "RemoveContainer" containerID="af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01" Dec 04 13:40:56 crc kubenswrapper[4760]: E1204 13:40:56.763154 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01\": container with ID starting with af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01 not found: ID does not exist" containerID="af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01" Dec 04 13:40:56 crc kubenswrapper[4760]: I1204 13:40:56.763182 4760 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01"} err="failed to get container status \"af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01\": rpc error: code = NotFound desc = could not find container \"af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01\": container with ID starting with af5e5dfc0a1b4e8d6675c4c9d14883b7273cc660e1529ac05a3ff230ba1f6a01 not found: ID does not exist" Dec 04 13:40:57 crc kubenswrapper[4760]: I1204 13:40:57.880807 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9238366-e87a-44d2-97d5-c1ce038304f5" path="/var/lib/kubelet/pods/f9238366-e87a-44d2-97d5-c1ce038304f5/volumes" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.864828 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:41:05 crc kubenswrapper[4760]: E1204 13:41:05.865610 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.931343 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tm5jt/must-gather-rwfhl"] Dec 04 13:41:05 crc kubenswrapper[4760]: E1204 13:41:05.932348 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerName="registry-server" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.932373 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerName="registry-server" Dec 04 13:41:05 crc kubenswrapper[4760]: E1204 
13:41:05.932386 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerName="extract-utilities" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.932396 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerName="extract-utilities" Dec 04 13:41:05 crc kubenswrapper[4760]: E1204 13:41:05.932460 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerName="extract-content" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.932470 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerName="extract-content" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.932750 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9238366-e87a-44d2-97d5-c1ce038304f5" containerName="registry-server" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.934235 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/must-gather-rwfhl" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.941469 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-tm5jt"/"default-dockercfg-47x88" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.945401 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tm5jt"/"kube-root-ca.crt" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.945856 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tm5jt"/"openshift-service-ca.crt" Dec 04 13:41:05 crc kubenswrapper[4760]: I1204 13:41:05.946463 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tm5jt/must-gather-rwfhl"] Dec 04 13:41:06 crc kubenswrapper[4760]: I1204 13:41:06.018090 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f52bc647-d752-45f2-a391-2d676657775b-must-gather-output\") pod \"must-gather-rwfhl\" (UID: \"f52bc647-d752-45f2-a391-2d676657775b\") " pod="openshift-must-gather-tm5jt/must-gather-rwfhl" Dec 04 13:41:06 crc kubenswrapper[4760]: I1204 13:41:06.018221 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss9kn\" (UniqueName: \"kubernetes.io/projected/f52bc647-d752-45f2-a391-2d676657775b-kube-api-access-ss9kn\") pod \"must-gather-rwfhl\" (UID: \"f52bc647-d752-45f2-a391-2d676657775b\") " pod="openshift-must-gather-tm5jt/must-gather-rwfhl" Dec 04 13:41:06 crc kubenswrapper[4760]: I1204 13:41:06.119787 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f52bc647-d752-45f2-a391-2d676657775b-must-gather-output\") pod \"must-gather-rwfhl\" (UID: \"f52bc647-d752-45f2-a391-2d676657775b\") " 
pod="openshift-must-gather-tm5jt/must-gather-rwfhl" Dec 04 13:41:06 crc kubenswrapper[4760]: I1204 13:41:06.119850 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss9kn\" (UniqueName: \"kubernetes.io/projected/f52bc647-d752-45f2-a391-2d676657775b-kube-api-access-ss9kn\") pod \"must-gather-rwfhl\" (UID: \"f52bc647-d752-45f2-a391-2d676657775b\") " pod="openshift-must-gather-tm5jt/must-gather-rwfhl" Dec 04 13:41:06 crc kubenswrapper[4760]: I1204 13:41:06.120646 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f52bc647-d752-45f2-a391-2d676657775b-must-gather-output\") pod \"must-gather-rwfhl\" (UID: \"f52bc647-d752-45f2-a391-2d676657775b\") " pod="openshift-must-gather-tm5jt/must-gather-rwfhl" Dec 04 13:41:06 crc kubenswrapper[4760]: I1204 13:41:06.143802 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss9kn\" (UniqueName: \"kubernetes.io/projected/f52bc647-d752-45f2-a391-2d676657775b-kube-api-access-ss9kn\") pod \"must-gather-rwfhl\" (UID: \"f52bc647-d752-45f2-a391-2d676657775b\") " pod="openshift-must-gather-tm5jt/must-gather-rwfhl" Dec 04 13:41:06 crc kubenswrapper[4760]: I1204 13:41:06.254974 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/must-gather-rwfhl" Dec 04 13:41:06 crc kubenswrapper[4760]: I1204 13:41:06.752813 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tm5jt/must-gather-rwfhl"] Dec 04 13:41:06 crc kubenswrapper[4760]: I1204 13:41:06.756389 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 13:41:06 crc kubenswrapper[4760]: I1204 13:41:06.816439 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/must-gather-rwfhl" event={"ID":"f52bc647-d752-45f2-a391-2d676657775b","Type":"ContainerStarted","Data":"0f9484aeea60b8db2b8240fe1e92a760050b3f8b8786ecd0ade7b7b005153e14"} Dec 04 13:41:11 crc kubenswrapper[4760]: I1204 13:41:11.900381 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/must-gather-rwfhl" event={"ID":"f52bc647-d752-45f2-a391-2d676657775b","Type":"ContainerStarted","Data":"4b3f90b5dd6589eb19cf6c140e6095a97bb1a04607b02f1771df6261af15dcba"} Dec 04 13:41:12 crc kubenswrapper[4760]: I1204 13:41:12.907959 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/must-gather-rwfhl" event={"ID":"f52bc647-d752-45f2-a391-2d676657775b","Type":"ContainerStarted","Data":"889962dedac26d990a7ca1811915317a883c424a22914ba7b74a7d4787cc1555"} Dec 04 13:41:12 crc kubenswrapper[4760]: I1204 13:41:12.945472 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tm5jt/must-gather-rwfhl" podStartSLOduration=3.344619613 podStartE2EDuration="7.94543736s" podCreationTimestamp="2025-12-04 13:41:05 +0000 UTC" firstStartedPulling="2025-12-04 13:41:06.756171195 +0000 UTC m=+5269.797617762" lastFinishedPulling="2025-12-04 13:41:11.356988942 +0000 UTC m=+5274.398435509" observedRunningTime="2025-12-04 13:41:12.931289211 +0000 UTC m=+5275.972735798" watchObservedRunningTime="2025-12-04 13:41:12.94543736 +0000 UTC 
m=+5275.986883927" Dec 04 13:41:16 crc kubenswrapper[4760]: I1204 13:41:16.864390 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:41:16 crc kubenswrapper[4760]: E1204 13:41:16.865187 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:41:17 crc kubenswrapper[4760]: I1204 13:41:17.885370 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tm5jt/crc-debug-hwg7s"] Dec 04 13:41:17 crc kubenswrapper[4760]: I1204 13:41:17.887815 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" Dec 04 13:41:17 crc kubenswrapper[4760]: I1204 13:41:17.921143 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8pmr\" (UniqueName: \"kubernetes.io/projected/af6afe0a-8bdc-4f42-a707-11efa4c14637-kube-api-access-f8pmr\") pod \"crc-debug-hwg7s\" (UID: \"af6afe0a-8bdc-4f42-a707-11efa4c14637\") " pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" Dec 04 13:41:17 crc kubenswrapper[4760]: I1204 13:41:17.921730 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af6afe0a-8bdc-4f42-a707-11efa4c14637-host\") pod \"crc-debug-hwg7s\" (UID: \"af6afe0a-8bdc-4f42-a707-11efa4c14637\") " pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" Dec 04 13:41:18 crc kubenswrapper[4760]: I1204 13:41:18.023805 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f8pmr\" (UniqueName: \"kubernetes.io/projected/af6afe0a-8bdc-4f42-a707-11efa4c14637-kube-api-access-f8pmr\") pod \"crc-debug-hwg7s\" (UID: \"af6afe0a-8bdc-4f42-a707-11efa4c14637\") " pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" Dec 04 13:41:18 crc kubenswrapper[4760]: I1204 13:41:18.023917 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af6afe0a-8bdc-4f42-a707-11efa4c14637-host\") pod \"crc-debug-hwg7s\" (UID: \"af6afe0a-8bdc-4f42-a707-11efa4c14637\") " pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" Dec 04 13:41:18 crc kubenswrapper[4760]: I1204 13:41:18.024104 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af6afe0a-8bdc-4f42-a707-11efa4c14637-host\") pod \"crc-debug-hwg7s\" (UID: \"af6afe0a-8bdc-4f42-a707-11efa4c14637\") " pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" Dec 04 13:41:18 crc kubenswrapper[4760]: I1204 13:41:18.042856 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8pmr\" (UniqueName: \"kubernetes.io/projected/af6afe0a-8bdc-4f42-a707-11efa4c14637-kube-api-access-f8pmr\") pod \"crc-debug-hwg7s\" (UID: \"af6afe0a-8bdc-4f42-a707-11efa4c14637\") " pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" Dec 04 13:41:18 crc kubenswrapper[4760]: I1204 13:41:18.214185 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" Dec 04 13:41:18 crc kubenswrapper[4760]: I1204 13:41:18.986135 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" event={"ID":"af6afe0a-8bdc-4f42-a707-11efa4c14637","Type":"ContainerStarted","Data":"8d5ec8424fe8200079d62fa36001e7713c473b10e2660c05931c1980d76aa5cf"} Dec 04 13:41:31 crc kubenswrapper[4760]: I1204 13:41:31.117641 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" event={"ID":"af6afe0a-8bdc-4f42-a707-11efa4c14637","Type":"ContainerStarted","Data":"45f5259f59239f0de993b5859e01ae4c095bf63db3bfb3fd87f01b7b872f0488"} Dec 04 13:41:31 crc kubenswrapper[4760]: I1204 13:41:31.151311 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" podStartSLOduration=2.418515785 podStartE2EDuration="14.151289639s" podCreationTimestamp="2025-12-04 13:41:17 +0000 UTC" firstStartedPulling="2025-12-04 13:41:18.253652398 +0000 UTC m=+5281.295098975" lastFinishedPulling="2025-12-04 13:41:29.986426262 +0000 UTC m=+5293.027872829" observedRunningTime="2025-12-04 13:41:31.140460745 +0000 UTC m=+5294.181907312" watchObservedRunningTime="2025-12-04 13:41:31.151289639 +0000 UTC m=+5294.192736196" Dec 04 13:41:31 crc kubenswrapper[4760]: I1204 13:41:31.865008 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:41:31 crc kubenswrapper[4760]: E1204 13:41:31.865796 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" 
podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:41:43 crc kubenswrapper[4760]: I1204 13:41:43.866700 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:41:43 crc kubenswrapper[4760]: E1204 13:41:43.867603 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:41:55 crc kubenswrapper[4760]: I1204 13:41:55.864713 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:41:55 crc kubenswrapper[4760]: E1204 13:41:55.865479 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:42:10 crc kubenswrapper[4760]: I1204 13:42:10.865003 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:42:10 crc kubenswrapper[4760]: E1204 13:42:10.865832 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:42:23 crc kubenswrapper[4760]: I1204 13:42:23.682631 4760 generic.go:334] "Generic (PLEG): container finished" podID="af6afe0a-8bdc-4f42-a707-11efa4c14637" containerID="45f5259f59239f0de993b5859e01ae4c095bf63db3bfb3fd87f01b7b872f0488" exitCode=0 Dec 04 13:42:23 crc kubenswrapper[4760]: I1204 13:42:23.682738 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" event={"ID":"af6afe0a-8bdc-4f42-a707-11efa4c14637","Type":"ContainerDied","Data":"45f5259f59239f0de993b5859e01ae4c095bf63db3bfb3fd87f01b7b872f0488"} Dec 04 13:42:23 crc kubenswrapper[4760]: I1204 13:42:23.865266 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:42:23 crc kubenswrapper[4760]: E1204 13:42:23.865873 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:42:24 crc kubenswrapper[4760]: I1204 13:42:24.806398 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" Dec 04 13:42:24 crc kubenswrapper[4760]: I1204 13:42:24.848962 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tm5jt/crc-debug-hwg7s"] Dec 04 13:42:24 crc kubenswrapper[4760]: I1204 13:42:24.857873 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tm5jt/crc-debug-hwg7s"] Dec 04 13:42:24 crc kubenswrapper[4760]: I1204 13:42:24.913382 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8pmr\" (UniqueName: \"kubernetes.io/projected/af6afe0a-8bdc-4f42-a707-11efa4c14637-kube-api-access-f8pmr\") pod \"af6afe0a-8bdc-4f42-a707-11efa4c14637\" (UID: \"af6afe0a-8bdc-4f42-a707-11efa4c14637\") " Dec 04 13:42:24 crc kubenswrapper[4760]: I1204 13:42:24.913692 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af6afe0a-8bdc-4f42-a707-11efa4c14637-host\") pod \"af6afe0a-8bdc-4f42-a707-11efa4c14637\" (UID: \"af6afe0a-8bdc-4f42-a707-11efa4c14637\") " Dec 04 13:42:24 crc kubenswrapper[4760]: I1204 13:42:24.913804 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af6afe0a-8bdc-4f42-a707-11efa4c14637-host" (OuterVolumeSpecName: "host") pod "af6afe0a-8bdc-4f42-a707-11efa4c14637" (UID: "af6afe0a-8bdc-4f42-a707-11efa4c14637"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 13:42:24 crc kubenswrapper[4760]: I1204 13:42:24.914204 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/af6afe0a-8bdc-4f42-a707-11efa4c14637-host\") on node \"crc\" DevicePath \"\"" Dec 04 13:42:24 crc kubenswrapper[4760]: I1204 13:42:24.919935 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af6afe0a-8bdc-4f42-a707-11efa4c14637-kube-api-access-f8pmr" (OuterVolumeSpecName: "kube-api-access-f8pmr") pod "af6afe0a-8bdc-4f42-a707-11efa4c14637" (UID: "af6afe0a-8bdc-4f42-a707-11efa4c14637"). InnerVolumeSpecName "kube-api-access-f8pmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:42:25 crc kubenswrapper[4760]: I1204 13:42:25.016094 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8pmr\" (UniqueName: \"kubernetes.io/projected/af6afe0a-8bdc-4f42-a707-11efa4c14637-kube-api-access-f8pmr\") on node \"crc\" DevicePath \"\"" Dec 04 13:42:25 crc kubenswrapper[4760]: I1204 13:42:25.705078 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d5ec8424fe8200079d62fa36001e7713c473b10e2660c05931c1980d76aa5cf" Dec 04 13:42:25 crc kubenswrapper[4760]: I1204 13:42:25.705129 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-hwg7s" Dec 04 13:42:25 crc kubenswrapper[4760]: I1204 13:42:25.878110 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af6afe0a-8bdc-4f42-a707-11efa4c14637" path="/var/lib/kubelet/pods/af6afe0a-8bdc-4f42-a707-11efa4c14637/volumes" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.021267 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tm5jt/crc-debug-p2kkj"] Dec 04 13:42:26 crc kubenswrapper[4760]: E1204 13:42:26.021834 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af6afe0a-8bdc-4f42-a707-11efa4c14637" containerName="container-00" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.021855 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="af6afe0a-8bdc-4f42-a707-11efa4c14637" containerName="container-00" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.022077 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="af6afe0a-8bdc-4f42-a707-11efa4c14637" containerName="container-00" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.022847 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.037843 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd96z\" (UniqueName: \"kubernetes.io/projected/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-kube-api-access-nd96z\") pod \"crc-debug-p2kkj\" (UID: \"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4\") " pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.039859 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-host\") pod \"crc-debug-p2kkj\" (UID: \"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4\") " pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.143671 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd96z\" (UniqueName: \"kubernetes.io/projected/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-kube-api-access-nd96z\") pod \"crc-debug-p2kkj\" (UID: \"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4\") " pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.143815 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-host\") pod \"crc-debug-p2kkj\" (UID: \"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4\") " pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.143992 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-host\") pod \"crc-debug-p2kkj\" (UID: \"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4\") " pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" Dec 04 13:42:26 crc 
kubenswrapper[4760]: I1204 13:42:26.163160 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd96z\" (UniqueName: \"kubernetes.io/projected/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-kube-api-access-nd96z\") pod \"crc-debug-p2kkj\" (UID: \"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4\") " pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.343284 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" Dec 04 13:42:26 crc kubenswrapper[4760]: I1204 13:42:26.715693 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" event={"ID":"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4","Type":"ContainerStarted","Data":"1fd61480c41005c6a7c1407dba541fea305d101326fbf550c85194d9f84899db"} Dec 04 13:42:27 crc kubenswrapper[4760]: I1204 13:42:27.728146 4760 generic.go:334] "Generic (PLEG): container finished" podID="0af9adda-6ef2-4b6b-8b43-69ac4fec71d4" containerID="7f172888436492b34aab61b84d2f8622b96cd2721c16ab0b1b4dc6d640122ffe" exitCode=0 Dec 04 13:42:27 crc kubenswrapper[4760]: I1204 13:42:27.728195 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" event={"ID":"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4","Type":"ContainerDied","Data":"7f172888436492b34aab61b84d2f8622b96cd2721c16ab0b1b4dc6d640122ffe"} Dec 04 13:42:29 crc kubenswrapper[4760]: I1204 13:42:29.425489 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" Dec 04 13:42:29 crc kubenswrapper[4760]: I1204 13:42:29.511651 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd96z\" (UniqueName: \"kubernetes.io/projected/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-kube-api-access-nd96z\") pod \"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4\" (UID: \"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4\") " Dec 04 13:42:29 crc kubenswrapper[4760]: I1204 13:42:29.511784 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-host\") pod \"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4\" (UID: \"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4\") " Dec 04 13:42:29 crc kubenswrapper[4760]: I1204 13:42:29.512549 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-host" (OuterVolumeSpecName: "host") pod "0af9adda-6ef2-4b6b-8b43-69ac4fec71d4" (UID: "0af9adda-6ef2-4b6b-8b43-69ac4fec71d4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 13:42:29 crc kubenswrapper[4760]: I1204 13:42:29.526440 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-kube-api-access-nd96z" (OuterVolumeSpecName: "kube-api-access-nd96z") pod "0af9adda-6ef2-4b6b-8b43-69ac4fec71d4" (UID: "0af9adda-6ef2-4b6b-8b43-69ac4fec71d4"). InnerVolumeSpecName "kube-api-access-nd96z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:42:29 crc kubenswrapper[4760]: I1204 13:42:29.616037 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd96z\" (UniqueName: \"kubernetes.io/projected/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-kube-api-access-nd96z\") on node \"crc\" DevicePath \"\"" Dec 04 13:42:29 crc kubenswrapper[4760]: I1204 13:42:29.616077 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4-host\") on node \"crc\" DevicePath \"\"" Dec 04 13:42:29 crc kubenswrapper[4760]: I1204 13:42:29.750659 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" event={"ID":"0af9adda-6ef2-4b6b-8b43-69ac4fec71d4","Type":"ContainerDied","Data":"1fd61480c41005c6a7c1407dba541fea305d101326fbf550c85194d9f84899db"} Dec 04 13:42:29 crc kubenswrapper[4760]: I1204 13:42:29.750703 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fd61480c41005c6a7c1407dba541fea305d101326fbf550c85194d9f84899db" Dec 04 13:42:29 crc kubenswrapper[4760]: I1204 13:42:29.750758 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-p2kkj" Dec 04 13:42:30 crc kubenswrapper[4760]: I1204 13:42:30.289408 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tm5jt/crc-debug-p2kkj"] Dec 04 13:42:30 crc kubenswrapper[4760]: I1204 13:42:30.298125 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tm5jt/crc-debug-p2kkj"] Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.488198 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tm5jt/crc-debug-q5cf4"] Dec 04 13:42:31 crc kubenswrapper[4760]: E1204 13:42:31.489325 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af9adda-6ef2-4b6b-8b43-69ac4fec71d4" containerName="container-00" Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.489344 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af9adda-6ef2-4b6b-8b43-69ac4fec71d4" containerName="container-00" Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.489581 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af9adda-6ef2-4b6b-8b43-69ac4fec71d4" containerName="container-00" Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.490344 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.653964 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6acc562-1ffc-49a6-a659-0ead796f93ca-host\") pod \"crc-debug-q5cf4\" (UID: \"b6acc562-1ffc-49a6-a659-0ead796f93ca\") " pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.654705 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjkm5\" (UniqueName: \"kubernetes.io/projected/b6acc562-1ffc-49a6-a659-0ead796f93ca-kube-api-access-fjkm5\") pod \"crc-debug-q5cf4\" (UID: \"b6acc562-1ffc-49a6-a659-0ead796f93ca\") " pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.757277 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6acc562-1ffc-49a6-a659-0ead796f93ca-host\") pod \"crc-debug-q5cf4\" (UID: \"b6acc562-1ffc-49a6-a659-0ead796f93ca\") " pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.757509 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6acc562-1ffc-49a6-a659-0ead796f93ca-host\") pod \"crc-debug-q5cf4\" (UID: \"b6acc562-1ffc-49a6-a659-0ead796f93ca\") " pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.757520 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjkm5\" (UniqueName: \"kubernetes.io/projected/b6acc562-1ffc-49a6-a659-0ead796f93ca-kube-api-access-fjkm5\") pod \"crc-debug-q5cf4\" (UID: \"b6acc562-1ffc-49a6-a659-0ead796f93ca\") " pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" Dec 04 13:42:31 crc 
kubenswrapper[4760]: I1204 13:42:31.804081 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjkm5\" (UniqueName: \"kubernetes.io/projected/b6acc562-1ffc-49a6-a659-0ead796f93ca-kube-api-access-fjkm5\") pod \"crc-debug-q5cf4\" (UID: \"b6acc562-1ffc-49a6-a659-0ead796f93ca\") " pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.812302 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" Dec 04 13:42:31 crc kubenswrapper[4760]: W1204 13:42:31.886391 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6acc562_1ffc_49a6_a659_0ead796f93ca.slice/crio-1ec08f5a08efeccfbd210686c4e885b131fe0d065ab045cec95f31c290128cca WatchSource:0}: Error finding container 1ec08f5a08efeccfbd210686c4e885b131fe0d065ab045cec95f31c290128cca: Status 404 returned error can't find the container with id 1ec08f5a08efeccfbd210686c4e885b131fe0d065ab045cec95f31c290128cca Dec 04 13:42:31 crc kubenswrapper[4760]: I1204 13:42:31.908178 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af9adda-6ef2-4b6b-8b43-69ac4fec71d4" path="/var/lib/kubelet/pods/0af9adda-6ef2-4b6b-8b43-69ac4fec71d4/volumes" Dec 04 13:42:32 crc kubenswrapper[4760]: I1204 13:42:32.780882 4760 generic.go:334] "Generic (PLEG): container finished" podID="b6acc562-1ffc-49a6-a659-0ead796f93ca" containerID="9f3248ff37bef00855871142f59d2cdfeaa77736029e354cd148ef14e0965629" exitCode=0 Dec 04 13:42:32 crc kubenswrapper[4760]: I1204 13:42:32.780972 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" event={"ID":"b6acc562-1ffc-49a6-a659-0ead796f93ca","Type":"ContainerDied","Data":"9f3248ff37bef00855871142f59d2cdfeaa77736029e354cd148ef14e0965629"} Dec 04 13:42:32 crc kubenswrapper[4760]: I1204 13:42:32.781173 4760 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" event={"ID":"b6acc562-1ffc-49a6-a659-0ead796f93ca","Type":"ContainerStarted","Data":"1ec08f5a08efeccfbd210686c4e885b131fe0d065ab045cec95f31c290128cca"} Dec 04 13:42:32 crc kubenswrapper[4760]: I1204 13:42:32.832635 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tm5jt/crc-debug-q5cf4"] Dec 04 13:42:32 crc kubenswrapper[4760]: I1204 13:42:32.842312 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tm5jt/crc-debug-q5cf4"] Dec 04 13:42:33 crc kubenswrapper[4760]: I1204 13:42:33.969365 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" Dec 04 13:42:34 crc kubenswrapper[4760]: I1204 13:42:34.136480 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6acc562-1ffc-49a6-a659-0ead796f93ca-host\") pod \"b6acc562-1ffc-49a6-a659-0ead796f93ca\" (UID: \"b6acc562-1ffc-49a6-a659-0ead796f93ca\") " Dec 04 13:42:34 crc kubenswrapper[4760]: I1204 13:42:34.136596 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6acc562-1ffc-49a6-a659-0ead796f93ca-host" (OuterVolumeSpecName: "host") pod "b6acc562-1ffc-49a6-a659-0ead796f93ca" (UID: "b6acc562-1ffc-49a6-a659-0ead796f93ca"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 13:42:34 crc kubenswrapper[4760]: I1204 13:42:34.136671 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjkm5\" (UniqueName: \"kubernetes.io/projected/b6acc562-1ffc-49a6-a659-0ead796f93ca-kube-api-access-fjkm5\") pod \"b6acc562-1ffc-49a6-a659-0ead796f93ca\" (UID: \"b6acc562-1ffc-49a6-a659-0ead796f93ca\") " Dec 04 13:42:34 crc kubenswrapper[4760]: I1204 13:42:34.137328 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6acc562-1ffc-49a6-a659-0ead796f93ca-host\") on node \"crc\" DevicePath \"\"" Dec 04 13:42:34 crc kubenswrapper[4760]: I1204 13:42:34.142585 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6acc562-1ffc-49a6-a659-0ead796f93ca-kube-api-access-fjkm5" (OuterVolumeSpecName: "kube-api-access-fjkm5") pod "b6acc562-1ffc-49a6-a659-0ead796f93ca" (UID: "b6acc562-1ffc-49a6-a659-0ead796f93ca"). InnerVolumeSpecName "kube-api-access-fjkm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:42:34 crc kubenswrapper[4760]: I1204 13:42:34.239460 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjkm5\" (UniqueName: \"kubernetes.io/projected/b6acc562-1ffc-49a6-a659-0ead796f93ca-kube-api-access-fjkm5\") on node \"crc\" DevicePath \"\"" Dec 04 13:42:34 crc kubenswrapper[4760]: I1204 13:42:34.802974 4760 scope.go:117] "RemoveContainer" containerID="9f3248ff37bef00855871142f59d2cdfeaa77736029e354cd148ef14e0965629" Dec 04 13:42:34 crc kubenswrapper[4760]: I1204 13:42:34.803148 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/crc-debug-q5cf4" Dec 04 13:42:35 crc kubenswrapper[4760]: I1204 13:42:35.877407 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6acc562-1ffc-49a6-a659-0ead796f93ca" path="/var/lib/kubelet/pods/b6acc562-1ffc-49a6-a659-0ead796f93ca/volumes" Dec 04 13:42:36 crc kubenswrapper[4760]: I1204 13:42:36.864791 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:42:36 crc kubenswrapper[4760]: E1204 13:42:36.865174 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:42:51 crc kubenswrapper[4760]: I1204 13:42:51.864628 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:42:51 crc kubenswrapper[4760]: E1204 13:42:51.865406 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:42:52 crc kubenswrapper[4760]: I1204 13:42:52.231903 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-658b6c7fb4-vh8cp_ac9e67e5-eea3-4608-bd30-8483225d28d2/barbican-api/0.log" Dec 04 13:42:52 crc kubenswrapper[4760]: I1204 13:42:52.282478 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-658b6c7fb4-vh8cp_ac9e67e5-eea3-4608-bd30-8483225d28d2/barbican-api-log/0.log" Dec 04 13:42:52 crc kubenswrapper[4760]: I1204 13:42:52.441727 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-795464b486-tr8rf_17dd60e5-2f5d-4f7d-b694-9fa3245dc207/barbican-keystone-listener/0.log" Dec 04 13:42:52 crc kubenswrapper[4760]: I1204 13:42:52.634159 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-577684486f-vqq72_e6ee4654-5dd5-4c14-9985-1037a884e4b7/barbican-worker/0.log" Dec 04 13:42:52 crc kubenswrapper[4760]: I1204 13:42:52.740288 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-577684486f-vqq72_e6ee4654-5dd5-4c14-9985-1037a884e4b7/barbican-worker-log/0.log" Dec 04 13:42:52 crc kubenswrapper[4760]: I1204 13:42:52.924858 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r_fd552df6-e07d-4042-b4d7-8b154163e633/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:42:53 crc kubenswrapper[4760]: I1204 13:42:53.210388 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b898fcbf-2997-40cf-b167-8875a2763092/ceilometer-central-agent/0.log" Dec 04 13:42:53 crc kubenswrapper[4760]: I1204 13:42:53.210832 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b898fcbf-2997-40cf-b167-8875a2763092/ceilometer-notification-agent/0.log" Dec 04 13:42:53 crc kubenswrapper[4760]: I1204 13:42:53.326800 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b898fcbf-2997-40cf-b167-8875a2763092/proxy-httpd/0.log" Dec 04 13:42:53 crc kubenswrapper[4760]: I1204 13:42:53.383247 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-795464b486-tr8rf_17dd60e5-2f5d-4f7d-b694-9fa3245dc207/barbican-keystone-listener-log/0.log" Dec 04 13:42:53 crc kubenswrapper[4760]: I1204 13:42:53.482961 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b898fcbf-2997-40cf-b167-8875a2763092/sg-core/0.log" Dec 04 13:42:53 crc kubenswrapper[4760]: I1204 13:42:53.680184 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_ffee0fcf-4f0c-4471-8b39-1762da661157/ceph/0.log" Dec 04 13:42:53 crc kubenswrapper[4760]: I1204 13:42:53.970239 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd/cinder-api-log/0.log" Dec 04 13:42:54 crc kubenswrapper[4760]: I1204 13:42:54.037702 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd/cinder-api/0.log" Dec 04 13:42:54 crc kubenswrapper[4760]: I1204 13:42:54.244520 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_21d77c4c-3493-44c4-b194-6d9dd912d5a1/probe/0.log" Dec 04 13:42:54 crc kubenswrapper[4760]: I1204 13:42:54.298839 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5bf94526-eda5-4784-a223-e0ff51ec09e8/cinder-scheduler/0.log" Dec 04 13:42:54 crc kubenswrapper[4760]: I1204 13:42:54.591997 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5bf94526-eda5-4784-a223-e0ff51ec09e8/probe/0.log" Dec 04 13:42:54 crc kubenswrapper[4760]: I1204 13:42:54.841674 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_82e6f33c-9843-4810-9cec-5b7b7525d759/probe/0.log" Dec 04 13:42:55 crc kubenswrapper[4760]: I1204 13:42:55.195339 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-9x22f_7d193294-81b1-457c-99d0-9701df78978b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:42:55 crc kubenswrapper[4760]: I1204 13:42:55.347667 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6_c5a0aee6-7728-4bcf-8361-93bc45069c7f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:42:55 crc kubenswrapper[4760]: I1204 13:42:55.597884 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d99fc9df9-g4fqd_19e80d45-0318-4d8f-8567-e3aef4734081/init/0.log" Dec 04 13:42:55 crc kubenswrapper[4760]: I1204 13:42:55.823040 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d99fc9df9-g4fqd_19e80d45-0318-4d8f-8567-e3aef4734081/init/0.log" Dec 04 13:42:56 crc kubenswrapper[4760]: I1204 13:42:56.048427 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-89lhj_831952cf-f2b0-482f-bd5e-69dcf19821f9/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:42:56 crc kubenswrapper[4760]: I1204 13:42:56.092760 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d99fc9df9-g4fqd_19e80d45-0318-4d8f-8567-e3aef4734081/dnsmasq-dns/0.log" Dec 04 13:42:56 crc kubenswrapper[4760]: I1204 13:42:56.254512 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fa001053-23fb-4a80-8f36-8efc97cdc04d/glance-httpd/0.log" Dec 04 13:42:56 crc kubenswrapper[4760]: I1204 13:42:56.303906 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fa001053-23fb-4a80-8f36-8efc97cdc04d/glance-log/0.log" Dec 04 13:42:56 crc kubenswrapper[4760]: I1204 13:42:56.591057 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_034481e8-9000-4642-8f09-01e015db2de2/glance-log/0.log" Dec 04 13:42:56 crc kubenswrapper[4760]: I1204 13:42:56.601350 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_034481e8-9000-4642-8f09-01e015db2de2/glance-httpd/0.log" Dec 04 13:42:56 crc kubenswrapper[4760]: I1204 13:42:56.901259 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_21d77c4c-3493-44c4-b194-6d9dd912d5a1/cinder-backup/0.log" Dec 04 13:42:56 crc kubenswrapper[4760]: I1204 13:42:56.944985 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5b7fc6c944-sh7tv_a6452e5d-5eb7-4d21-96ea-eefbc327f2f5/horizon/2.log" Dec 04 13:42:56 crc kubenswrapper[4760]: I1204 13:42:56.951933 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5b7fc6c944-sh7tv_a6452e5d-5eb7-4d21-96ea-eefbc327f2f5/horizon/1.log" Dec 04 13:42:57 crc kubenswrapper[4760]: I1204 13:42:57.195071 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-6z22t_d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:42:57 crc kubenswrapper[4760]: I1204 13:42:57.467926 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-b7gjm_38e0514a-720e-4407-9e18-9fff5e901aab/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:42:57 crc kubenswrapper[4760]: I1204 13:42:57.492531 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29414221-d4ff5_a5b9d885-9739-498a-bd0e-fd78e0d5c779/keystone-cron/0.log" Dec 04 13:42:57 crc kubenswrapper[4760]: I1204 13:42:57.688768 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5b7fc6c944-sh7tv_a6452e5d-5eb7-4d21-96ea-eefbc327f2f5/horizon-log/0.log" Dec 04 
13:42:57 crc kubenswrapper[4760]: I1204 13:42:57.890257 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_82e6f33c-9843-4810-9cec-5b7b7525d759/cinder-volume/0.log" Dec 04 13:42:57 crc kubenswrapper[4760]: I1204 13:42:57.893176 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2cded8b4-450e-4646-b9c4-9df7334f5532/kube-state-metrics/0.log" Dec 04 13:42:58 crc kubenswrapper[4760]: I1204 13:42:58.096040 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x_e2dcfb70-5791-401e-a7d3-cec6bf1f4dba/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:42:58 crc kubenswrapper[4760]: I1204 13:42:58.546544 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_de257688-7d38-4795-b8a6-36b58bdbc2b8/manila-api/0.log" Dec 04 13:42:58 crc kubenswrapper[4760]: I1204 13:42:58.612827 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_da99ecae-3ef9-484d-a420-0317df7654d5/probe/0.log" Dec 04 13:42:58 crc kubenswrapper[4760]: I1204 13:42:58.730557 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_da99ecae-3ef9-484d-a420-0317df7654d5/manila-scheduler/0.log" Dec 04 13:42:59 crc kubenswrapper[4760]: I1204 13:42:59.027501 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_546d61e1-ffcb-48a3-8dae-929470ae8372/probe/0.log" Dec 04 13:42:59 crc kubenswrapper[4760]: I1204 13:42:59.370413 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_de257688-7d38-4795-b8a6-36b58bdbc2b8/manila-api-log/0.log" Dec 04 13:42:59 crc kubenswrapper[4760]: I1204 13:42:59.405770 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_546d61e1-ffcb-48a3-8dae-929470ae8372/manila-share/0.log" Dec 04 13:43:00 crc 
kubenswrapper[4760]: I1204 13:43:00.371698 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6_4e5c739f-fcc7-4384-b7e0-302daee90091/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:43:00 crc kubenswrapper[4760]: I1204 13:43:00.484782 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-869c8d7d5c-srd5v_c05ef76a-b809-4d3d-972c-2e5d2037b806/neutron-httpd/0.log" Dec 04 13:43:01 crc kubenswrapper[4760]: I1204 13:43:01.072774 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-869c8d7d5c-srd5v_c05ef76a-b809-4d3d-972c-2e5d2037b806/neutron-api/0.log" Dec 04 13:43:02 crc kubenswrapper[4760]: I1204 13:43:02.030948 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9f10a840-d1c4-4d92-bb37-5abe342cb4d1/nova-cell0-conductor-conductor/0.log" Dec 04 13:43:02 crc kubenswrapper[4760]: I1204 13:43:02.681083 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4e8ed881-2d3f-4939-b065-5b6860ad523d/nova-cell1-conductor-conductor/0.log" Dec 04 13:43:02 crc kubenswrapper[4760]: I1204 13:43:02.865108 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:43:02 crc kubenswrapper[4760]: E1204 13:43:02.865427 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:43:02 crc kubenswrapper[4760]: I1204 13:43:02.997412 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-d74ff5d57-x29r6_da835318-50f2-43af-9988-bad83a5ee42c/keystone-api/0.log" Dec 04 13:43:03 crc kubenswrapper[4760]: I1204 13:43:03.356284 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0074865f-e381-4090-a72e-54eec164814e/nova-cell1-novncproxy-novncproxy/0.log" Dec 04 13:43:03 crc kubenswrapper[4760]: I1204 13:43:03.589809 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-fx275_b8bba20c-b75b-40da-98ad-436a4d121d13/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:43:03 crc kubenswrapper[4760]: I1204 13:43:03.593407 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3f4c620c-1c1a-4aa3-aa92-5df1a205e70d/nova-api-log/0.log" Dec 04 13:43:04 crc kubenswrapper[4760]: I1204 13:43:04.060829 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d7631823-c503-49fd-85ac-ec4b8bc18a5b/nova-metadata-log/0.log" Dec 04 13:43:04 crc kubenswrapper[4760]: I1204 13:43:04.371569 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3f4c620c-1c1a-4aa3-aa92-5df1a205e70d/nova-api-api/0.log" Dec 04 13:43:04 crc kubenswrapper[4760]: I1204 13:43:04.501540 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff/mysql-bootstrap/0.log" Dec 04 13:43:04 crc kubenswrapper[4760]: I1204 13:43:04.600433 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_deac79a1-f615-4bba-b00f-11784b824094/nova-scheduler-scheduler/0.log" Dec 04 13:43:04 crc kubenswrapper[4760]: I1204 13:43:04.703033 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff/mysql-bootstrap/0.log" Dec 04 13:43:04 crc kubenswrapper[4760]: I1204 13:43:04.754510 4760 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff/galera/0.log" Dec 04 13:43:04 crc kubenswrapper[4760]: I1204 13:43:04.938849 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9649c38c-ebc4-4103-aa55-c2aa867d6e26/mysql-bootstrap/0.log" Dec 04 13:43:05 crc kubenswrapper[4760]: I1204 13:43:05.167743 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9649c38c-ebc4-4103-aa55-c2aa867d6e26/mysql-bootstrap/0.log" Dec 04 13:43:05 crc kubenswrapper[4760]: I1204 13:43:05.185671 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9649c38c-ebc4-4103-aa55-c2aa867d6e26/galera/0.log" Dec 04 13:43:05 crc kubenswrapper[4760]: I1204 13:43:05.365167 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_edbb46fc-c8ec-45c9-bdb2-36639d92402e/openstackclient/0.log" Dec 04 13:43:05 crc kubenswrapper[4760]: I1204 13:43:05.666189 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-frcmm_52334746-99b6-4056-a7d7-6df95b72d8de/ovn-controller/0.log" Dec 04 13:43:05 crc kubenswrapper[4760]: I1204 13:43:05.829606 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xwlm9_a4174c36-258e-4ed9-b6a7-f52818d3faed/openstack-network-exporter/0.log" Dec 04 13:43:06 crc kubenswrapper[4760]: I1204 13:43:06.035970 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-76hxp_7198d8b5-1a9e-45e7-8151-922d62c1e1f0/ovsdb-server-init/0.log" Dec 04 13:43:06 crc kubenswrapper[4760]: I1204 13:43:06.164793 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d7631823-c503-49fd-85ac-ec4b8bc18a5b/nova-metadata-metadata/0.log" Dec 04 13:43:06 crc kubenswrapper[4760]: I1204 13:43:06.210423 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-76hxp_7198d8b5-1a9e-45e7-8151-922d62c1e1f0/ovsdb-server-init/0.log" Dec 04 13:43:06 crc kubenswrapper[4760]: I1204 13:43:06.267352 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-76hxp_7198d8b5-1a9e-45e7-8151-922d62c1e1f0/ovsdb-server/0.log" Dec 04 13:43:06 crc kubenswrapper[4760]: I1204 13:43:06.382582 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-76hxp_7198d8b5-1a9e-45e7-8151-922d62c1e1f0/ovs-vswitchd/0.log" Dec 04 13:43:06 crc kubenswrapper[4760]: I1204 13:43:06.593126 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fg4xt_aec90918-8692-4e3d-ba94-7b8e358b8f60/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:43:06 crc kubenswrapper[4760]: I1204 13:43:06.673800 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fa2846c9-8951-497c-bcae-d186f8f62265/openstack-network-exporter/0.log" Dec 04 13:43:06 crc kubenswrapper[4760]: I1204 13:43:06.710264 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fa2846c9-8951-497c-bcae-d186f8f62265/ovn-northd/0.log" Dec 04 13:43:06 crc kubenswrapper[4760]: I1204 13:43:06.855318 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_591b6997-8c15-499c-8218-e222a178559e/openstack-network-exporter/0.log" Dec 04 13:43:06 crc kubenswrapper[4760]: I1204 13:43:06.944629 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_591b6997-8c15-499c-8218-e222a178559e/ovsdbserver-nb/0.log" Dec 04 13:43:07 crc kubenswrapper[4760]: I1204 13:43:07.043646 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b055fc8b-2181-441b-b4b3-efa345cfde65/openstack-network-exporter/0.log" Dec 04 13:43:07 crc kubenswrapper[4760]: I1204 13:43:07.117624 4760 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b055fc8b-2181-441b-b4b3-efa345cfde65/ovsdbserver-sb/0.log" Dec 04 13:43:07 crc kubenswrapper[4760]: I1204 13:43:07.462538 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_482cdb14-c28c-44e0-8054-a5e782a71b54/setup-container/0.log" Dec 04 13:43:07 crc kubenswrapper[4760]: I1204 13:43:07.670863 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_482cdb14-c28c-44e0-8054-a5e782a71b54/setup-container/0.log" Dec 04 13:43:07 crc kubenswrapper[4760]: I1204 13:43:07.704120 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_482cdb14-c28c-44e0-8054-a5e782a71b54/rabbitmq/0.log" Dec 04 13:43:07 crc kubenswrapper[4760]: I1204 13:43:07.726940 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6999bbbcbb-mwnkr_72a9e917-ec75-4b75-a7db-ca42c3e8d1f5/placement-api/0.log" Dec 04 13:43:07 crc kubenswrapper[4760]: I1204 13:43:07.858046 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6999bbbcbb-mwnkr_72a9e917-ec75-4b75-a7db-ca42c3e8d1f5/placement-log/0.log" Dec 04 13:43:07 crc kubenswrapper[4760]: I1204 13:43:07.896557 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_50565ce8-ee16-43b2-af07-c92e7444546c/setup-container/0.log" Dec 04 13:43:08 crc kubenswrapper[4760]: I1204 13:43:08.168641 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd_f8e97dd8-8609-4469-a5f8-488c6b3a2098/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:43:08 crc kubenswrapper[4760]: I1204 13:43:08.177163 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_50565ce8-ee16-43b2-af07-c92e7444546c/rabbitmq/0.log" Dec 04 13:43:08 crc kubenswrapper[4760]: I1204 13:43:08.189474 4760 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_50565ce8-ee16-43b2-af07-c92e7444546c/setup-container/0.log" Dec 04 13:43:08 crc kubenswrapper[4760]: I1204 13:43:08.388285 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pc5cw_4ba039cb-b160-4ec8-9f00-a42e7bcce289/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:43:08 crc kubenswrapper[4760]: I1204 13:43:08.415969 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx_51e288c0-c373-4aa9-9c38-cb94fbeccf01/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:43:08 crc kubenswrapper[4760]: I1204 13:43:08.687982 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-wj92m_264546ed-074b-4824-8da7-d711ccc821c5/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:43:08 crc kubenswrapper[4760]: I1204 13:43:08.719877 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8mgzh_b297e018-4fcb-40a4-b5f5-3105c4300ae7/ssh-known-hosts-edpm-deployment/0.log" Dec 04 13:43:08 crc kubenswrapper[4760]: I1204 13:43:08.963807 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f6947989-zvg6b_8b0dacfe-716a-44d3-a653-88fc5183ae97/proxy-server/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.155977 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f6947989-zvg6b_8b0dacfe-716a-44d3-a653-88fc5183ae97/proxy-httpd/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.224894 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-m4x4d_4ce3174d-015c-4a85-b58d-af7603479902/swift-ring-rebalance/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.270245 4760 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/account-auditor/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.462615 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/account-reaper/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.496703 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/account-server/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.504782 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/account-replicator/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.541009 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/container-auditor/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.721308 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/container-server/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.757548 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/container-replicator/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.812281 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/container-updater/0.log" Dec 04 13:43:09 crc kubenswrapper[4760]: I1204 13:43:09.823254 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/object-auditor/0.log" Dec 04 13:43:10 crc kubenswrapper[4760]: I1204 13:43:10.011831 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/object-expirer/0.log" Dec 04 13:43:10 crc kubenswrapper[4760]: I1204 13:43:10.051584 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/object-server/0.log" Dec 04 13:43:10 crc kubenswrapper[4760]: I1204 13:43:10.084016 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/object-replicator/0.log" Dec 04 13:43:10 crc kubenswrapper[4760]: I1204 13:43:10.104928 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/object-updater/0.log" Dec 04 13:43:10 crc kubenswrapper[4760]: I1204 13:43:10.764163 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/rsync/0.log" Dec 04 13:43:10 crc kubenswrapper[4760]: I1204 13:43:10.786874 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/swift-recon-cron/0.log" Dec 04 13:43:10 crc kubenswrapper[4760]: I1204 13:43:10.861972 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc_6b99a8e4-6932-4867-b485-872dfefcf4fc/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:43:11 crc kubenswrapper[4760]: I1204 13:43:11.103359 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2526d1e6-dd17-4093-92b2-1bee2a207bac/test-operator-logs-container/0.log" Dec 04 13:43:11 crc kubenswrapper[4760]: I1204 13:43:11.123799 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ef91667b-5e29-49a0-9de9-d557462e96c0/tempest-tests-tempest-tests-runner/0.log" Dec 04 13:43:11 crc kubenswrapper[4760]: I1204 
13:43:11.341165 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-59m8w_85fd1b45-21c2-4541-bac9-ce63eddbc242/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:43:16 crc kubenswrapper[4760]: I1204 13:43:16.864341 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:43:17 crc kubenswrapper[4760]: I1204 13:43:17.269148 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"d08476c52d8ebac0dd6f63e9a6446171d7ea6de161be3bbf19b17d7a0d8daa01"} Dec 04 13:43:28 crc kubenswrapper[4760]: I1204 13:43:28.741063 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_40657d96-3f5d-4da0-9783-845c41bfeaae/memcached/0.log" Dec 04 13:43:42 crc kubenswrapper[4760]: I1204 13:43:42.512683 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/util/0.log" Dec 04 13:43:42 crc kubenswrapper[4760]: I1204 13:43:42.712735 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/util/0.log" Dec 04 13:43:42 crc kubenswrapper[4760]: I1204 13:43:42.768911 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/pull/0.log" Dec 04 13:43:42 crc kubenswrapper[4760]: I1204 13:43:42.777449 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/pull/0.log" Dec 04 13:43:42 crc kubenswrapper[4760]: I1204 13:43:42.935999 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/util/0.log" Dec 04 13:43:42 crc kubenswrapper[4760]: I1204 13:43:42.978543 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/pull/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.044910 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/extract/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.174773 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-kzkts_69f6297b-8cdc-4bfc-ba61-1868e7805998/kube-rbac-proxy/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.231020 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qgdz6_2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd/kube-rbac-proxy/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.296547 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-kzkts_69f6297b-8cdc-4bfc-ba61-1868e7805998/manager/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.447911 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qgdz6_2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd/manager/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: 
I1204 13:43:43.486019 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-42kkv_9093443e-058e-41f0-81ea-9ff8ba566d8a/kube-rbac-proxy/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.538769 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-42kkv_9093443e-058e-41f0-81ea-9ff8ba566d8a/manager/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.674018 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-8c7dd_e8a9a9f4-8e40-4506-9aeb-c3e83d62de39/kube-rbac-proxy/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.797035 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-8c7dd_e8a9a9f4-8e40-4506-9aeb-c3e83d62de39/manager/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.875299 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-hfqvw_abc24b8c-0be3-44c7-b011-5ea10803fdf1/kube-rbac-proxy/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.937247 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-hfqvw_abc24b8c-0be3-44c7-b011-5ea10803fdf1/manager/0.log" Dec 04 13:43:43 crc kubenswrapper[4760]: I1204 13:43:43.971524 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-5jtgd_4018757f-a398-4734-9a4e-b6cc11327b9f/kube-rbac-proxy/0.log" Dec 04 13:43:44 crc kubenswrapper[4760]: I1204 13:43:44.088406 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-5jtgd_4018757f-a398-4734-9a4e-b6cc11327b9f/manager/0.log" Dec 04 13:43:44 crc kubenswrapper[4760]: I1204 13:43:44.194640 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-2ffsj_73caa66c-d120-4b70-b417-d7f363ce6236/kube-rbac-proxy/0.log" Dec 04 13:43:44 crc kubenswrapper[4760]: I1204 13:43:44.395259 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-thpz6_ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a/kube-rbac-proxy/0.log" Dec 04 13:43:44 crc kubenswrapper[4760]: I1204 13:43:44.401062 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-2ffsj_73caa66c-d120-4b70-b417-d7f363ce6236/manager/0.log" Dec 04 13:43:44 crc kubenswrapper[4760]: I1204 13:43:44.403555 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-thpz6_ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a/manager/0.log" Dec 04 13:43:44 crc kubenswrapper[4760]: I1204 13:43:44.577081 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4866m_35ad36be-f7a9-4ca8-bd29-0d5ccd658c53/kube-rbac-proxy/0.log" Dec 04 13:43:44 crc kubenswrapper[4760]: I1204 13:43:44.664326 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4866m_35ad36be-f7a9-4ca8-bd29-0d5ccd658c53/manager/0.log" Dec 04 13:43:44 crc kubenswrapper[4760]: I1204 13:43:44.780479 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-g64p8_b7918a8b-7f47-4d71-820b-95156b273357/kube-rbac-proxy/0.log" Dec 04 13:43:44 crc kubenswrapper[4760]: I1204 13:43:44.834280 
4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-g64p8_b7918a8b-7f47-4d71-820b-95156b273357/manager/0.log" Dec 04 13:43:44 crc kubenswrapper[4760]: I1204 13:43:44.897469 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-m9phl_546bf075-78c1-4324-ba9a-80ac8df0c4f7/kube-rbac-proxy/0.log" Dec 04 13:43:45 crc kubenswrapper[4760]: I1204 13:43:45.006623 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-m9phl_546bf075-78c1-4324-ba9a-80ac8df0c4f7/manager/0.log" Dec 04 13:43:45 crc kubenswrapper[4760]: I1204 13:43:45.070615 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-x5zgc_282a54d8-5318-49e0-aefe-a86a7a8d63ac/kube-rbac-proxy/0.log" Dec 04 13:43:45 crc kubenswrapper[4760]: I1204 13:43:45.163029 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-x5zgc_282a54d8-5318-49e0-aefe-a86a7a8d63ac/manager/0.log" Dec 04 13:43:45 crc kubenswrapper[4760]: I1204 13:43:45.253232 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-57pzb_a01a242a-291b-4281-a331-91c05efcdf87/kube-rbac-proxy/0.log" Dec 04 13:43:45 crc kubenswrapper[4760]: I1204 13:43:45.406796 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-57pzb_a01a242a-291b-4281-a331-91c05efcdf87/manager/0.log" Dec 04 13:43:45 crc kubenswrapper[4760]: I1204 13:43:45.425929 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-slr7f_f13ef420-321e-40f8-90d2-e6fdcbb72752/kube-rbac-proxy/0.log" Dec 04 13:43:45 crc 
kubenswrapper[4760]: I1204 13:43:45.474039 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-slr7f_f13ef420-321e-40f8-90d2-e6fdcbb72752/manager/0.log" Dec 04 13:43:45 crc kubenswrapper[4760]: I1204 13:43:45.631846 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7_ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9/kube-rbac-proxy/0.log" Dec 04 13:43:45 crc kubenswrapper[4760]: I1204 13:43:45.640577 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7_ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9/manager/0.log" Dec 04 13:43:45 crc kubenswrapper[4760]: I1204 13:43:45.987641 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-645cc8bbd-8pfbz_91a1898b-cdb0-4f97-9bc0-242d1980bd8c/operator/0.log" Dec 04 13:43:46 crc kubenswrapper[4760]: I1204 13:43:46.398927 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-k2f7j_a4e176f0-09d3-4710-a8a7-32cd09f03c4d/registry-server/0.log" Dec 04 13:43:46 crc kubenswrapper[4760]: I1204 13:43:46.414028 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-zxmkv_97874450-ee94-4963-aa10-a58295edae62/kube-rbac-proxy/0.log" Dec 04 13:43:46 crc kubenswrapper[4760]: I1204 13:43:46.667085 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-zxmkv_97874450-ee94-4963-aa10-a58295edae62/manager/0.log" Dec 04 13:43:46 crc kubenswrapper[4760]: I1204 13:43:46.683096 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-ss2gd_21a55b1d-ebff-4abd-a556-d272a1753a5b/manager/0.log" Dec 04 13:43:46 crc kubenswrapper[4760]: I1204 13:43:46.712942 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-ss2gd_21a55b1d-ebff-4abd-a556-d272a1753a5b/kube-rbac-proxy/0.log" Dec 04 13:43:46 crc kubenswrapper[4760]: I1204 13:43:46.952267 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pptq9_35d340c4-abab-4dc8-8ba4-e8740d6b89d4/operator/0.log" Dec 04 13:43:46 crc kubenswrapper[4760]: I1204 13:43:46.965746 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-q9nzq_fcf368e1-183d-445d-b3b7-dfd4f08fddcd/kube-rbac-proxy/0.log" Dec 04 13:43:47 crc kubenswrapper[4760]: I1204 13:43:47.112585 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77fb648ff9-cnn8v_bfa893f7-8101-4fd1-ae93-94688b827e95/manager/0.log" Dec 04 13:43:47 crc kubenswrapper[4760]: I1204 13:43:47.229342 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-q9nzq_fcf368e1-183d-445d-b3b7-dfd4f08fddcd/manager/0.log" Dec 04 13:43:47 crc kubenswrapper[4760]: I1204 13:43:47.325031 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-twx55_a96cd5b4-6668-4815-b121-777fe0e65833/kube-rbac-proxy/0.log" Dec 04 13:43:47 crc kubenswrapper[4760]: I1204 13:43:47.409231 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-twx55_a96cd5b4-6668-4815-b121-777fe0e65833/manager/0.log" Dec 04 13:43:47 crc kubenswrapper[4760]: I1204 
13:43:47.508783 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-g22dq_750e95e1-6019-4779-a2c0-4abcce4b1c8c/kube-rbac-proxy/0.log" Dec 04 13:43:47 crc kubenswrapper[4760]: I1204 13:43:47.553139 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-g22dq_750e95e1-6019-4779-a2c0-4abcce4b1c8c/manager/0.log" Dec 04 13:43:47 crc kubenswrapper[4760]: I1204 13:43:47.617718 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-wwqps_7c8092df-4a88-4a8c-a400-6435f525a5ec/kube-rbac-proxy/0.log" Dec 04 13:43:48 crc kubenswrapper[4760]: I1204 13:43:48.146156 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-wwqps_7c8092df-4a88-4a8c-a400-6435f525a5ec/manager/0.log" Dec 04 13:44:07 crc kubenswrapper[4760]: I1204 13:44:07.890685 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nwwhw_fdbd7bc3-cca1-4368-814a-126ba13a4f8e/control-plane-machine-set-operator/0.log" Dec 04 13:44:08 crc kubenswrapper[4760]: I1204 13:44:08.040040 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ghltq_a6b7fabb-b00b-41d3-9a63-291959a7c157/machine-api-operator/0.log" Dec 04 13:44:08 crc kubenswrapper[4760]: I1204 13:44:08.119114 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ghltq_a6b7fabb-b00b-41d3-9a63-291959a7c157/kube-rbac-proxy/0.log" Dec 04 13:44:20 crc kubenswrapper[4760]: I1204 13:44:20.766731 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-z26nl_7ec7a22b-07c7-4ea7-b80a-cb9003ef2fcc/cert-manager-controller/0.log" Dec 04 
13:44:20 crc kubenswrapper[4760]: I1204 13:44:20.920105 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-q62nq_e6ca39a1-4f59-4e58-85a9-eb60075647a8/cert-manager-cainjector/0.log" Dec 04 13:44:20 crc kubenswrapper[4760]: I1204 13:44:20.984248 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-gs5vd_6e691fa7-b524-4703-9ba2-9b5d2936deef/cert-manager-webhook/0.log" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.626961 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ffqpq"] Dec 04 13:44:23 crc kubenswrapper[4760]: E1204 13:44:23.628155 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6acc562-1ffc-49a6-a659-0ead796f93ca" containerName="container-00" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.628170 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6acc562-1ffc-49a6-a659-0ead796f93ca" containerName="container-00" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.628421 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6acc562-1ffc-49a6-a659-0ead796f93ca" containerName="container-00" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.629939 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.642409 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ffqpq"] Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.756906 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-utilities\") pod \"community-operators-ffqpq\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.756966 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk66p\" (UniqueName: \"kubernetes.io/projected/0c073bc5-a35d-4fed-bf34-92a43854a50e-kube-api-access-kk66p\") pod \"community-operators-ffqpq\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.757100 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-catalog-content\") pod \"community-operators-ffqpq\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.859412 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-catalog-content\") pod \"community-operators-ffqpq\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.859646 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-utilities\") pod \"community-operators-ffqpq\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.859686 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk66p\" (UniqueName: \"kubernetes.io/projected/0c073bc5-a35d-4fed-bf34-92a43854a50e-kube-api-access-kk66p\") pod \"community-operators-ffqpq\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.860565 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-utilities\") pod \"community-operators-ffqpq\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.860706 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-catalog-content\") pod \"community-operators-ffqpq\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.905922 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk66p\" (UniqueName: \"kubernetes.io/projected/0c073bc5-a35d-4fed-bf34-92a43854a50e-kube-api-access-kk66p\") pod \"community-operators-ffqpq\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:23 crc kubenswrapper[4760]: I1204 13:44:23.963998 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:24 crc kubenswrapper[4760]: I1204 13:44:24.564533 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ffqpq"] Dec 04 13:44:25 crc kubenswrapper[4760]: I1204 13:44:25.023813 4760 generic.go:334] "Generic (PLEG): container finished" podID="0c073bc5-a35d-4fed-bf34-92a43854a50e" containerID="97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd" exitCode=0 Dec 04 13:44:25 crc kubenswrapper[4760]: I1204 13:44:25.023866 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffqpq" event={"ID":"0c073bc5-a35d-4fed-bf34-92a43854a50e","Type":"ContainerDied","Data":"97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd"} Dec 04 13:44:25 crc kubenswrapper[4760]: I1204 13:44:25.024111 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffqpq" event={"ID":"0c073bc5-a35d-4fed-bf34-92a43854a50e","Type":"ContainerStarted","Data":"eef2969af4345d0609f227912b63511a8d36fbaae9d1affdcd9f912df032ca33"} Dec 04 13:44:26 crc kubenswrapper[4760]: I1204 13:44:26.035523 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffqpq" event={"ID":"0c073bc5-a35d-4fed-bf34-92a43854a50e","Type":"ContainerStarted","Data":"bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5"} Dec 04 13:44:27 crc kubenswrapper[4760]: I1204 13:44:27.045843 4760 generic.go:334] "Generic (PLEG): container finished" podID="0c073bc5-a35d-4fed-bf34-92a43854a50e" containerID="bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5" exitCode=0 Dec 04 13:44:27 crc kubenswrapper[4760]: I1204 13:44:27.045945 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffqpq" 
event={"ID":"0c073bc5-a35d-4fed-bf34-92a43854a50e","Type":"ContainerDied","Data":"bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5"} Dec 04 13:44:29 crc kubenswrapper[4760]: I1204 13:44:29.066146 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffqpq" event={"ID":"0c073bc5-a35d-4fed-bf34-92a43854a50e","Type":"ContainerStarted","Data":"f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195"} Dec 04 13:44:29 crc kubenswrapper[4760]: I1204 13:44:29.094456 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ffqpq" podStartSLOduration=3.549516791 podStartE2EDuration="6.094407437s" podCreationTimestamp="2025-12-04 13:44:23 +0000 UTC" firstStartedPulling="2025-12-04 13:44:25.026440436 +0000 UTC m=+5468.067887003" lastFinishedPulling="2025-12-04 13:44:27.571331072 +0000 UTC m=+5470.612777649" observedRunningTime="2025-12-04 13:44:29.085474753 +0000 UTC m=+5472.126921320" watchObservedRunningTime="2025-12-04 13:44:29.094407437 +0000 UTC m=+5472.135854004" Dec 04 13:44:33 crc kubenswrapper[4760]: I1204 13:44:33.965078 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:33 crc kubenswrapper[4760]: I1204 13:44:33.965708 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:34 crc kubenswrapper[4760]: I1204 13:44:34.017146 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:34 crc kubenswrapper[4760]: I1204 13:44:34.164182 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:34 crc kubenswrapper[4760]: I1204 13:44:34.258698 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-ffqpq"] Dec 04 13:44:34 crc kubenswrapper[4760]: I1204 13:44:34.544341 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-6hrpm_ac746240-d1e4-4a04-98f1-b22871ca58e4/nmstate-console-plugin/0.log" Dec 04 13:44:35 crc kubenswrapper[4760]: I1204 13:44:35.022234 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wvpfn_491432c2-b909-4092-a693-409b65208f85/nmstate-handler/0.log" Dec 04 13:44:35 crc kubenswrapper[4760]: I1204 13:44:35.037733 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6wct8_e4f6d4b1-9f69-4970-a2d0-141049cbee82/nmstate-metrics/0.log" Dec 04 13:44:35 crc kubenswrapper[4760]: I1204 13:44:35.051252 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6wct8_e4f6d4b1-9f69-4970-a2d0-141049cbee82/kube-rbac-proxy/0.log" Dec 04 13:44:35 crc kubenswrapper[4760]: I1204 13:44:35.222008 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-vv8cz_a753604d-ead8-4550-be02-3a5ae4827390/nmstate-operator/0.log" Dec 04 13:44:35 crc kubenswrapper[4760]: I1204 13:44:35.317262 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-lmqv2_e9bf818f-d737-4114-a5c3-003834179d27/nmstate-webhook/0.log" Dec 04 13:44:36 crc kubenswrapper[4760]: I1204 13:44:36.128990 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ffqpq" podUID="0c073bc5-a35d-4fed-bf34-92a43854a50e" containerName="registry-server" containerID="cri-o://f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195" gracePeriod=2 Dec 04 13:44:36 crc kubenswrapper[4760]: I1204 13:44:36.653174 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:36 crc kubenswrapper[4760]: I1204 13:44:36.734387 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-catalog-content\") pod \"0c073bc5-a35d-4fed-bf34-92a43854a50e\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " Dec 04 13:44:36 crc kubenswrapper[4760]: I1204 13:44:36.734608 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-utilities\") pod \"0c073bc5-a35d-4fed-bf34-92a43854a50e\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " Dec 04 13:44:36 crc kubenswrapper[4760]: I1204 13:44:36.734701 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk66p\" (UniqueName: \"kubernetes.io/projected/0c073bc5-a35d-4fed-bf34-92a43854a50e-kube-api-access-kk66p\") pod \"0c073bc5-a35d-4fed-bf34-92a43854a50e\" (UID: \"0c073bc5-a35d-4fed-bf34-92a43854a50e\") " Dec 04 13:44:36 crc kubenswrapper[4760]: I1204 13:44:36.735549 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-utilities" (OuterVolumeSpecName: "utilities") pod "0c073bc5-a35d-4fed-bf34-92a43854a50e" (UID: "0c073bc5-a35d-4fed-bf34-92a43854a50e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:44:36 crc kubenswrapper[4760]: I1204 13:44:36.746477 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c073bc5-a35d-4fed-bf34-92a43854a50e-kube-api-access-kk66p" (OuterVolumeSpecName: "kube-api-access-kk66p") pod "0c073bc5-a35d-4fed-bf34-92a43854a50e" (UID: "0c073bc5-a35d-4fed-bf34-92a43854a50e"). InnerVolumeSpecName "kube-api-access-kk66p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:44:36 crc kubenswrapper[4760]: I1204 13:44:36.836965 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:44:36 crc kubenswrapper[4760]: I1204 13:44:36.837008 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk66p\" (UniqueName: \"kubernetes.io/projected/0c073bc5-a35d-4fed-bf34-92a43854a50e-kube-api-access-kk66p\") on node \"crc\" DevicePath \"\"" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.139645 4760 generic.go:334] "Generic (PLEG): container finished" podID="0c073bc5-a35d-4fed-bf34-92a43854a50e" containerID="f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195" exitCode=0 Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.139706 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffqpq" event={"ID":"0c073bc5-a35d-4fed-bf34-92a43854a50e","Type":"ContainerDied","Data":"f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195"} Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.139742 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ffqpq" event={"ID":"0c073bc5-a35d-4fed-bf34-92a43854a50e","Type":"ContainerDied","Data":"eef2969af4345d0609f227912b63511a8d36fbaae9d1affdcd9f912df032ca33"} Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.139763 4760 scope.go:117] "RemoveContainer" containerID="f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.139944 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ffqpq" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.163370 4760 scope.go:117] "RemoveContainer" containerID="bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.184227 4760 scope.go:117] "RemoveContainer" containerID="97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.227685 4760 scope.go:117] "RemoveContainer" containerID="f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195" Dec 04 13:44:37 crc kubenswrapper[4760]: E1204 13:44:37.228188 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195\": container with ID starting with f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195 not found: ID does not exist" containerID="f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.228250 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195"} err="failed to get container status \"f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195\": rpc error: code = NotFound desc = could not find container \"f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195\": container with ID starting with f8da8678d4e3a69760489f9a22056d92594ca918b52713c7c738cfd9bbdf2195 not found: ID does not exist" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.228279 4760 scope.go:117] "RemoveContainer" containerID="bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5" Dec 04 13:44:37 crc kubenswrapper[4760]: E1204 13:44:37.228607 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5\": container with ID starting with bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5 not found: ID does not exist" containerID="bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.228681 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5"} err="failed to get container status \"bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5\": rpc error: code = NotFound desc = could not find container \"bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5\": container with ID starting with bb19bada5f51428344b6f4832172b3315abfdb968243bef7d09331a0da0cf2e5 not found: ID does not exist" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.228722 4760 scope.go:117] "RemoveContainer" containerID="97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd" Dec 04 13:44:37 crc kubenswrapper[4760]: E1204 13:44:37.229080 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd\": container with ID starting with 97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd not found: ID does not exist" containerID="97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.229133 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd"} err="failed to get container status \"97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd\": rpc error: code = NotFound desc = could not find container 
\"97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd\": container with ID starting with 97773203ad2645c5953613a2747ed6346446867cee101975a44b17ea0fedc8fd not found: ID does not exist" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.434151 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c073bc5-a35d-4fed-bf34-92a43854a50e" (UID: "0c073bc5-a35d-4fed-bf34-92a43854a50e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.450833 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c073bc5-a35d-4fed-bf34-92a43854a50e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.483556 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ffqpq"] Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.495651 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ffqpq"] Dec 04 13:44:37 crc kubenswrapper[4760]: I1204 13:44:37.875763 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c073bc5-a35d-4fed-bf34-92a43854a50e" path="/var/lib/kubelet/pods/0c073bc5-a35d-4fed-bf34-92a43854a50e/volumes" Dec 04 13:44:50 crc kubenswrapper[4760]: I1204 13:44:50.304332 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-9vpvj_66980e43-ab0c-4f0e-a66b-1ba0047809d2/kube-rbac-proxy/0.log" Dec 04 13:44:50 crc kubenswrapper[4760]: I1204 13:44:50.356456 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-9vpvj_66980e43-ab0c-4f0e-a66b-1ba0047809d2/controller/0.log" Dec 04 13:44:50 crc 
kubenswrapper[4760]: I1204 13:44:50.476583 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-frr-files/0.log" Dec 04 13:44:50 crc kubenswrapper[4760]: I1204 13:44:50.684885 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-reloader/0.log" Dec 04 13:44:50 crc kubenswrapper[4760]: I1204 13:44:50.707424 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-metrics/0.log" Dec 04 13:44:50 crc kubenswrapper[4760]: I1204 13:44:50.719080 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-frr-files/0.log" Dec 04 13:44:50 crc kubenswrapper[4760]: I1204 13:44:50.748501 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-reloader/0.log" Dec 04 13:44:50 crc kubenswrapper[4760]: I1204 13:44:50.921928 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-frr-files/0.log" Dec 04 13:44:50 crc kubenswrapper[4760]: I1204 13:44:50.957692 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-reloader/0.log" Dec 04 13:44:50 crc kubenswrapper[4760]: I1204 13:44:50.966133 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-metrics/0.log" Dec 04 13:44:50 crc kubenswrapper[4760]: I1204 13:44:50.995412 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-metrics/0.log" Dec 04 13:44:51 crc kubenswrapper[4760]: I1204 13:44:51.140678 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-reloader/0.log" Dec 04 13:44:51 crc kubenswrapper[4760]: I1204 13:44:51.161887 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-metrics/0.log" Dec 04 13:44:51 crc kubenswrapper[4760]: I1204 13:44:51.165510 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-frr-files/0.log" Dec 04 13:44:51 crc kubenswrapper[4760]: I1204 13:44:51.190230 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/controller/0.log" Dec 04 13:44:51 crc kubenswrapper[4760]: I1204 13:44:51.336914 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/kube-rbac-proxy/0.log" Dec 04 13:44:51 crc kubenswrapper[4760]: I1204 13:44:51.337487 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/frr-metrics/0.log" Dec 04 13:44:51 crc kubenswrapper[4760]: I1204 13:44:51.409124 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/kube-rbac-proxy-frr/0.log" Dec 04 13:44:51 crc kubenswrapper[4760]: I1204 13:44:51.566399 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/reloader/0.log" Dec 04 13:44:51 crc kubenswrapper[4760]: I1204 13:44:51.658772 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-5pb2g_49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae/frr-k8s-webhook-server/0.log" Dec 04 13:44:51 crc kubenswrapper[4760]: I1204 13:44:51.881289 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-645c4f57b7-84fcv_f4735108-14df-4389-af8f-d3e7c56eba8f/manager/0.log" Dec 04 13:44:52 crc kubenswrapper[4760]: I1204 13:44:52.376048 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7578d999c8-hsqg8_ce6e08a4-5fa8-42c9-929d-94af09b81ec2/webhook-server/0.log" Dec 04 13:44:52 crc kubenswrapper[4760]: I1204 13:44:52.383029 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rxp6m_986d9828-3e9c-4b9a-bdc5-aaa3eb184641/kube-rbac-proxy/0.log" Dec 04 13:44:53 crc kubenswrapper[4760]: I1204 13:44:53.133677 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rxp6m_986d9828-3e9c-4b9a-bdc5-aaa3eb184641/speaker/0.log" Dec 04 13:44:53 crc kubenswrapper[4760]: I1204 13:44:53.304093 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/frr/0.log" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.158583 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn"] Dec 04 13:45:00 crc kubenswrapper[4760]: E1204 13:45:00.159875 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c073bc5-a35d-4fed-bf34-92a43854a50e" containerName="extract-content" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.159892 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c073bc5-a35d-4fed-bf34-92a43854a50e" containerName="extract-content" Dec 04 13:45:00 crc kubenswrapper[4760]: E1204 13:45:00.159902 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c073bc5-a35d-4fed-bf34-92a43854a50e" containerName="registry-server" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.159907 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c073bc5-a35d-4fed-bf34-92a43854a50e" 
containerName="registry-server" Dec 04 13:45:00 crc kubenswrapper[4760]: E1204 13:45:00.160159 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c073bc5-a35d-4fed-bf34-92a43854a50e" containerName="extract-utilities" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.160172 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c073bc5-a35d-4fed-bf34-92a43854a50e" containerName="extract-utilities" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.160507 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c073bc5-a35d-4fed-bf34-92a43854a50e" containerName="registry-server" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.161818 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.169525 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.169852 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.178945 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn"] Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.190615 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdnn\" (UniqueName: \"kubernetes.io/projected/58dc45cb-45c2-4d74-9038-500ac7400dc6-kube-api-access-thdnn\") pod \"collect-profiles-29414265-2b8pn\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.190795 4760 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58dc45cb-45c2-4d74-9038-500ac7400dc6-config-volume\") pod \"collect-profiles-29414265-2b8pn\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.190846 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58dc45cb-45c2-4d74-9038-500ac7400dc6-secret-volume\") pod \"collect-profiles-29414265-2b8pn\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.293813 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58dc45cb-45c2-4d74-9038-500ac7400dc6-config-volume\") pod \"collect-profiles-29414265-2b8pn\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.294353 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58dc45cb-45c2-4d74-9038-500ac7400dc6-secret-volume\") pod \"collect-profiles-29414265-2b8pn\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.294462 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdnn\" (UniqueName: \"kubernetes.io/projected/58dc45cb-45c2-4d74-9038-500ac7400dc6-kube-api-access-thdnn\") pod \"collect-profiles-29414265-2b8pn\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.296918 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58dc45cb-45c2-4d74-9038-500ac7400dc6-config-volume\") pod \"collect-profiles-29414265-2b8pn\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.308045 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58dc45cb-45c2-4d74-9038-500ac7400dc6-secret-volume\") pod \"collect-profiles-29414265-2b8pn\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.312040 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdnn\" (UniqueName: \"kubernetes.io/projected/58dc45cb-45c2-4d74-9038-500ac7400dc6-kube-api-access-thdnn\") pod \"collect-profiles-29414265-2b8pn\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.495478 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:00 crc kubenswrapper[4760]: I1204 13:45:00.966553 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn"] Dec 04 13:45:01 crc kubenswrapper[4760]: I1204 13:45:01.927370 4760 generic.go:334] "Generic (PLEG): container finished" podID="58dc45cb-45c2-4d74-9038-500ac7400dc6" containerID="c0bf1bedc713716c67a11dbd4b84f183293f55e3d57eb73d121909be7c97325c" exitCode=0 Dec 04 13:45:01 crc kubenswrapper[4760]: I1204 13:45:01.932610 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" event={"ID":"58dc45cb-45c2-4d74-9038-500ac7400dc6","Type":"ContainerDied","Data":"c0bf1bedc713716c67a11dbd4b84f183293f55e3d57eb73d121909be7c97325c"} Dec 04 13:45:01 crc kubenswrapper[4760]: I1204 13:45:01.932685 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" event={"ID":"58dc45cb-45c2-4d74-9038-500ac7400dc6","Type":"ContainerStarted","Data":"77145964281c7d78e68d4e334b382ad65168aa84babbe135970c8e492632d078"} Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.298345 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.440374 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thdnn\" (UniqueName: \"kubernetes.io/projected/58dc45cb-45c2-4d74-9038-500ac7400dc6-kube-api-access-thdnn\") pod \"58dc45cb-45c2-4d74-9038-500ac7400dc6\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.440851 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58dc45cb-45c2-4d74-9038-500ac7400dc6-config-volume\") pod \"58dc45cb-45c2-4d74-9038-500ac7400dc6\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.441019 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58dc45cb-45c2-4d74-9038-500ac7400dc6-secret-volume\") pod \"58dc45cb-45c2-4d74-9038-500ac7400dc6\" (UID: \"58dc45cb-45c2-4d74-9038-500ac7400dc6\") " Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.441494 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58dc45cb-45c2-4d74-9038-500ac7400dc6-config-volume" (OuterVolumeSpecName: "config-volume") pod "58dc45cb-45c2-4d74-9038-500ac7400dc6" (UID: "58dc45cb-45c2-4d74-9038-500ac7400dc6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.441896 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58dc45cb-45c2-4d74-9038-500ac7400dc6-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.446512 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58dc45cb-45c2-4d74-9038-500ac7400dc6-kube-api-access-thdnn" (OuterVolumeSpecName: "kube-api-access-thdnn") pod "58dc45cb-45c2-4d74-9038-500ac7400dc6" (UID: "58dc45cb-45c2-4d74-9038-500ac7400dc6"). InnerVolumeSpecName "kube-api-access-thdnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.448550 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58dc45cb-45c2-4d74-9038-500ac7400dc6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "58dc45cb-45c2-4d74-9038-500ac7400dc6" (UID: "58dc45cb-45c2-4d74-9038-500ac7400dc6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.544162 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/58dc45cb-45c2-4d74-9038-500ac7400dc6-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.544199 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thdnn\" (UniqueName: \"kubernetes.io/projected/58dc45cb-45c2-4d74-9038-500ac7400dc6-kube-api-access-thdnn\") on node \"crc\" DevicePath \"\"" Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.948282 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" event={"ID":"58dc45cb-45c2-4d74-9038-500ac7400dc6","Type":"ContainerDied","Data":"77145964281c7d78e68d4e334b382ad65168aa84babbe135970c8e492632d078"} Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.948334 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77145964281c7d78e68d4e334b382ad65168aa84babbe135970c8e492632d078" Dec 04 13:45:03 crc kubenswrapper[4760]: I1204 13:45:03.948399 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414265-2b8pn" Dec 04 13:45:04 crc kubenswrapper[4760]: I1204 13:45:04.384506 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf"] Dec 04 13:45:04 crc kubenswrapper[4760]: I1204 13:45:04.392670 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414220-4jjgf"] Dec 04 13:45:05 crc kubenswrapper[4760]: I1204 13:45:05.888950 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258b2022-1b3b-4fc8-a8fb-fa19b450eb98" path="/var/lib/kubelet/pods/258b2022-1b3b-4fc8-a8fb-fa19b450eb98/volumes" Dec 04 13:45:07 crc kubenswrapper[4760]: I1204 13:45:07.204104 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/util/0.log" Dec 04 13:45:07 crc kubenswrapper[4760]: I1204 13:45:07.405500 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/pull/0.log" Dec 04 13:45:07 crc kubenswrapper[4760]: I1204 13:45:07.423845 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/pull/0.log" Dec 04 13:45:07 crc kubenswrapper[4760]: I1204 13:45:07.433226 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/util/0.log" Dec 04 13:45:07 crc kubenswrapper[4760]: I1204 13:45:07.631826 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/extract/0.log" Dec 04 13:45:07 crc kubenswrapper[4760]: I1204 13:45:07.643345 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/pull/0.log" Dec 04 13:45:07 crc kubenswrapper[4760]: I1204 13:45:07.662982 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/util/0.log" Dec 04 13:45:07 crc kubenswrapper[4760]: I1204 13:45:07.837367 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/util/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.103965 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/pull/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.116404 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/pull/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.118097 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/util/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.290622 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/extract/0.log" Dec 
04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.325872 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/pull/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.338262 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/util/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.525367 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-utilities/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.726277 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-content/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.732120 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-utilities/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.744512 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-content/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.888911 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-utilities/0.log" Dec 04 13:45:08 crc kubenswrapper[4760]: I1204 13:45:08.923683 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-content/0.log" Dec 04 13:45:09 crc kubenswrapper[4760]: I1204 13:45:09.213108 
4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-utilities/0.log" Dec 04 13:45:09 crc kubenswrapper[4760]: I1204 13:45:09.743898 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/registry-server/0.log" Dec 04 13:45:09 crc kubenswrapper[4760]: I1204 13:45:09.882621 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-content/0.log" Dec 04 13:45:09 crc kubenswrapper[4760]: I1204 13:45:09.903979 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-utilities/0.log" Dec 04 13:45:10 crc kubenswrapper[4760]: I1204 13:45:10.045947 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-content/0.log" Dec 04 13:45:10 crc kubenswrapper[4760]: I1204 13:45:10.214230 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-utilities/0.log" Dec 04 13:45:10 crc kubenswrapper[4760]: I1204 13:45:10.241249 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-content/0.log" Dec 04 13:45:10 crc kubenswrapper[4760]: I1204 13:45:10.572997 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-utilities/0.log" Dec 04 13:45:10 crc kubenswrapper[4760]: I1204 13:45:10.586928 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xj92j_df130580-94d3-40cd-a840-c85281e78fcc/marketplace-operator/0.log" Dec 04 13:45:10 crc kubenswrapper[4760]: I1204 13:45:10.865824 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-utilities/0.log" Dec 04 13:45:10 crc kubenswrapper[4760]: I1204 13:45:10.954652 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-content/0.log" Dec 04 13:45:10 crc kubenswrapper[4760]: I1204 13:45:10.993439 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-content/0.log" Dec 04 13:45:10 crc kubenswrapper[4760]: I1204 13:45:10.997642 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/registry-server/0.log" Dec 04 13:45:11 crc kubenswrapper[4760]: I1204 13:45:11.232872 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-utilities/0.log" Dec 04 13:45:11 crc kubenswrapper[4760]: I1204 13:45:11.259119 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-content/0.log" Dec 04 13:45:11 crc kubenswrapper[4760]: I1204 13:45:11.516644 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-utilities/0.log" Dec 04 13:45:11 crc kubenswrapper[4760]: I1204 13:45:11.533780 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/registry-server/0.log" Dec 04 13:45:11 crc kubenswrapper[4760]: I1204 13:45:11.692141 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-content/0.log" Dec 04 13:45:11 crc kubenswrapper[4760]: I1204 13:45:11.715460 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-content/0.log" Dec 04 13:45:11 crc kubenswrapper[4760]: I1204 13:45:11.724580 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-utilities/0.log" Dec 04 13:45:11 crc kubenswrapper[4760]: I1204 13:45:11.891827 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-utilities/0.log" Dec 04 13:45:11 crc kubenswrapper[4760]: I1204 13:45:11.899510 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-content/0.log" Dec 04 13:45:12 crc kubenswrapper[4760]: I1204 13:45:12.655490 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/registry-server/0.log" Dec 04 13:45:33 crc kubenswrapper[4760]: I1204 13:45:33.380836 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:45:33 crc kubenswrapper[4760]: I1204 13:45:33.381448 4760 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:45:46 crc kubenswrapper[4760]: E1204 13:45:46.094408 4760 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.107:37784->38.102.83.107:35609: write tcp 38.102.83.107:37784->38.102.83.107:35609: write: broken pipe Dec 04 13:45:49 crc kubenswrapper[4760]: E1204 13:45:49.088335 4760 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.107:37954->38.102.83.107:35609: write tcp 38.102.83.107:37954->38.102.83.107:35609: write: broken pipe Dec 04 13:46:02 crc kubenswrapper[4760]: I1204 13:46:02.194015 4760 scope.go:117] "RemoveContainer" containerID="5ed4b9c46ab431b86548ccc97f133ee0f1e1c1b44af1316faa545d3adc41d5a9" Dec 04 13:46:03 crc kubenswrapper[4760]: I1204 13:46:03.380609 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:46:03 crc kubenswrapper[4760]: I1204 13:46:03.381750 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.339451 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hv6ds"] Dec 04 13:46:19 crc kubenswrapper[4760]: E1204 13:46:19.340551 4760 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="58dc45cb-45c2-4d74-9038-500ac7400dc6" containerName="collect-profiles" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.340575 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="58dc45cb-45c2-4d74-9038-500ac7400dc6" containerName="collect-profiles" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.340897 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="58dc45cb-45c2-4d74-9038-500ac7400dc6" containerName="collect-profiles" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.342911 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.362322 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hv6ds"] Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.447355 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkwd5\" (UniqueName: \"kubernetes.io/projected/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-kube-api-access-lkwd5\") pod \"redhat-operators-hv6ds\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.447438 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-utilities\") pod \"redhat-operators-hv6ds\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.447569 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-catalog-content\") pod \"redhat-operators-hv6ds\" (UID: 
\"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.549744 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-catalog-content\") pod \"redhat-operators-hv6ds\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.550120 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkwd5\" (UniqueName: \"kubernetes.io/projected/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-kube-api-access-lkwd5\") pod \"redhat-operators-hv6ds\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.550167 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-utilities\") pod \"redhat-operators-hv6ds\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.550410 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-catalog-content\") pod \"redhat-operators-hv6ds\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.550622 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-utilities\") pod \"redhat-operators-hv6ds\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " 
pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.576433 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkwd5\" (UniqueName: \"kubernetes.io/projected/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-kube-api-access-lkwd5\") pod \"redhat-operators-hv6ds\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:19 crc kubenswrapper[4760]: I1204 13:46:19.688785 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:20 crc kubenswrapper[4760]: I1204 13:46:20.184888 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hv6ds"] Dec 04 13:46:20 crc kubenswrapper[4760]: I1204 13:46:20.677267 4760 generic.go:334] "Generic (PLEG): container finished" podID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerID="505efab70dd32e6f0d2214514f58a24df98975a2633f42972fc92318d2d9e6c1" exitCode=0 Dec 04 13:46:20 crc kubenswrapper[4760]: I1204 13:46:20.677358 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv6ds" event={"ID":"2eba4a2e-bbe0-4df2-b85b-53ec3300328e","Type":"ContainerDied","Data":"505efab70dd32e6f0d2214514f58a24df98975a2633f42972fc92318d2d9e6c1"} Dec 04 13:46:20 crc kubenswrapper[4760]: I1204 13:46:20.677672 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv6ds" event={"ID":"2eba4a2e-bbe0-4df2-b85b-53ec3300328e","Type":"ContainerStarted","Data":"dff2ace6e2e8182863c141d7a7cc11bb3a52531312bf9c2f6343f1633c18eeec"} Dec 04 13:46:20 crc kubenswrapper[4760]: I1204 13:46:20.679632 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 13:46:21 crc kubenswrapper[4760]: I1204 13:46:21.693456 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-hv6ds" event={"ID":"2eba4a2e-bbe0-4df2-b85b-53ec3300328e","Type":"ContainerStarted","Data":"715c40adec7fad424fdf564fb2054aaa29ee9141ee9e46cd2cde4c09dc011196"} Dec 04 13:46:22 crc kubenswrapper[4760]: I1204 13:46:22.705676 4760 generic.go:334] "Generic (PLEG): container finished" podID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerID="715c40adec7fad424fdf564fb2054aaa29ee9141ee9e46cd2cde4c09dc011196" exitCode=0 Dec 04 13:46:22 crc kubenswrapper[4760]: I1204 13:46:22.705728 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv6ds" event={"ID":"2eba4a2e-bbe0-4df2-b85b-53ec3300328e","Type":"ContainerDied","Data":"715c40adec7fad424fdf564fb2054aaa29ee9141ee9e46cd2cde4c09dc011196"} Dec 04 13:46:23 crc kubenswrapper[4760]: I1204 13:46:23.717792 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv6ds" event={"ID":"2eba4a2e-bbe0-4df2-b85b-53ec3300328e","Type":"ContainerStarted","Data":"b403da8727619cef8e868e16939e077b17ec4852bdeeccc7ffd5a9811acec6ba"} Dec 04 13:46:23 crc kubenswrapper[4760]: I1204 13:46:23.744320 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hv6ds" podStartSLOduration=2.039472894 podStartE2EDuration="4.744250976s" podCreationTimestamp="2025-12-04 13:46:19 +0000 UTC" firstStartedPulling="2025-12-04 13:46:20.679271728 +0000 UTC m=+5583.720718295" lastFinishedPulling="2025-12-04 13:46:23.38404981 +0000 UTC m=+5586.425496377" observedRunningTime="2025-12-04 13:46:23.740112357 +0000 UTC m=+5586.781558924" watchObservedRunningTime="2025-12-04 13:46:23.744250976 +0000 UTC m=+5586.785697543" Dec 04 13:46:29 crc kubenswrapper[4760]: I1204 13:46:29.689551 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:29 crc kubenswrapper[4760]: I1204 13:46:29.690725 4760 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:29 crc kubenswrapper[4760]: I1204 13:46:29.747621 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:29 crc kubenswrapper[4760]: I1204 13:46:29.820372 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:29 crc kubenswrapper[4760]: I1204 13:46:29.994606 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hv6ds"] Dec 04 13:46:31 crc kubenswrapper[4760]: I1204 13:46:31.917101 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hv6ds" podUID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerName="registry-server" containerID="cri-o://b403da8727619cef8e868e16939e077b17ec4852bdeeccc7ffd5a9811acec6ba" gracePeriod=2 Dec 04 13:46:33 crc kubenswrapper[4760]: I1204 13:46:33.379900 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:46:33 crc kubenswrapper[4760]: I1204 13:46:33.380241 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:46:33 crc kubenswrapper[4760]: I1204 13:46:33.380291 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 13:46:33 
crc kubenswrapper[4760]: I1204 13:46:33.381427 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d08476c52d8ebac0dd6f63e9a6446171d7ea6de161be3bbf19b17d7a0d8daa01"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 13:46:33 crc kubenswrapper[4760]: I1204 13:46:33.381492 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://d08476c52d8ebac0dd6f63e9a6446171d7ea6de161be3bbf19b17d7a0d8daa01" gracePeriod=600 Dec 04 13:46:34 crc kubenswrapper[4760]: I1204 13:46:34.948061 4760 generic.go:334] "Generic (PLEG): container finished" podID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerID="b403da8727619cef8e868e16939e077b17ec4852bdeeccc7ffd5a9811acec6ba" exitCode=0 Dec 04 13:46:34 crc kubenswrapper[4760]: I1204 13:46:34.948648 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv6ds" event={"ID":"2eba4a2e-bbe0-4df2-b85b-53ec3300328e","Type":"ContainerDied","Data":"b403da8727619cef8e868e16939e077b17ec4852bdeeccc7ffd5a9811acec6ba"} Dec 04 13:46:34 crc kubenswrapper[4760]: I1204 13:46:34.951067 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="d08476c52d8ebac0dd6f63e9a6446171d7ea6de161be3bbf19b17d7a0d8daa01" exitCode=0 Dec 04 13:46:34 crc kubenswrapper[4760]: I1204 13:46:34.952984 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"d08476c52d8ebac0dd6f63e9a6446171d7ea6de161be3bbf19b17d7a0d8daa01"} Dec 04 13:46:34 crc 
kubenswrapper[4760]: I1204 13:46:34.953028 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f"} Dec 04 13:46:34 crc kubenswrapper[4760]: I1204 13:46:34.953050 4760 scope.go:117] "RemoveContainer" containerID="a5a210cb7921afef481012f0cc7b80e9eb959223236ab4e18fe31117aa825d6f" Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.111807 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.295719 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-catalog-content\") pod \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.295806 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkwd5\" (UniqueName: \"kubernetes.io/projected/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-kube-api-access-lkwd5\") pod \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.295842 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-utilities\") pod \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\" (UID: \"2eba4a2e-bbe0-4df2-b85b-53ec3300328e\") " Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.297327 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-utilities" (OuterVolumeSpecName: 
"utilities") pod "2eba4a2e-bbe0-4df2-b85b-53ec3300328e" (UID: "2eba4a2e-bbe0-4df2-b85b-53ec3300328e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.302608 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-kube-api-access-lkwd5" (OuterVolumeSpecName: "kube-api-access-lkwd5") pod "2eba4a2e-bbe0-4df2-b85b-53ec3300328e" (UID: "2eba4a2e-bbe0-4df2-b85b-53ec3300328e"). InnerVolumeSpecName "kube-api-access-lkwd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.398186 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkwd5\" (UniqueName: \"kubernetes.io/projected/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-kube-api-access-lkwd5\") on node \"crc\" DevicePath \"\"" Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.398235 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.402840 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2eba4a2e-bbe0-4df2-b85b-53ec3300328e" (UID: "2eba4a2e-bbe0-4df2-b85b-53ec3300328e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.499898 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eba4a2e-bbe0-4df2-b85b-53ec3300328e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.965348 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hv6ds" event={"ID":"2eba4a2e-bbe0-4df2-b85b-53ec3300328e","Type":"ContainerDied","Data":"dff2ace6e2e8182863c141d7a7cc11bb3a52531312bf9c2f6343f1633c18eeec"} Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.965404 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hv6ds" Dec 04 13:46:35 crc kubenswrapper[4760]: I1204 13:46:35.965411 4760 scope.go:117] "RemoveContainer" containerID="b403da8727619cef8e868e16939e077b17ec4852bdeeccc7ffd5a9811acec6ba" Dec 04 13:46:36 crc kubenswrapper[4760]: I1204 13:46:36.000140 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hv6ds"] Dec 04 13:46:36 crc kubenswrapper[4760]: I1204 13:46:36.003936 4760 scope.go:117] "RemoveContainer" containerID="715c40adec7fad424fdf564fb2054aaa29ee9141ee9e46cd2cde4c09dc011196" Dec 04 13:46:36 crc kubenswrapper[4760]: I1204 13:46:36.016316 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hv6ds"] Dec 04 13:46:36 crc kubenswrapper[4760]: I1204 13:46:36.032115 4760 scope.go:117] "RemoveContainer" containerID="505efab70dd32e6f0d2214514f58a24df98975a2633f42972fc92318d2d9e6c1" Dec 04 13:46:37 crc kubenswrapper[4760]: I1204 13:46:37.875335 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" path="/var/lib/kubelet/pods/2eba4a2e-bbe0-4df2-b85b-53ec3300328e/volumes" Dec 04 13:46:59 crc 
kubenswrapper[4760]: I1204 13:46:59.074545 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c9t9g"] Dec 04 13:46:59 crc kubenswrapper[4760]: E1204 13:46:59.075594 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerName="extract-content" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.075612 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerName="extract-content" Dec 04 13:46:59 crc kubenswrapper[4760]: E1204 13:46:59.075631 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerName="extract-utilities" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.075639 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerName="extract-utilities" Dec 04 13:46:59 crc kubenswrapper[4760]: E1204 13:46:59.075680 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerName="registry-server" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.075689 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerName="registry-server" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.075928 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eba4a2e-bbe0-4df2-b85b-53ec3300328e" containerName="registry-server" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.078050 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.095492 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9t9g"] Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.180829 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-catalog-content\") pod \"redhat-marketplace-c9t9g\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.181034 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-utilities\") pod \"redhat-marketplace-c9t9g\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.181749 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz9kf\" (UniqueName: \"kubernetes.io/projected/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-kube-api-access-bz9kf\") pod \"redhat-marketplace-c9t9g\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.283435 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-utilities\") pod \"redhat-marketplace-c9t9g\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.283668 4760 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-bz9kf\" (UniqueName: \"kubernetes.io/projected/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-kube-api-access-bz9kf\") pod \"redhat-marketplace-c9t9g\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.283706 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-catalog-content\") pod \"redhat-marketplace-c9t9g\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.284255 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-catalog-content\") pod \"redhat-marketplace-c9t9g\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.284550 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-utilities\") pod \"redhat-marketplace-c9t9g\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.319953 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz9kf\" (UniqueName: \"kubernetes.io/projected/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-kube-api-access-bz9kf\") pod \"redhat-marketplace-c9t9g\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.404051 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:46:59 crc kubenswrapper[4760]: I1204 13:46:59.980912 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9t9g"] Dec 04 13:47:00 crc kubenswrapper[4760]: I1204 13:47:00.223295 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9t9g" event={"ID":"70929ae1-f1a9-4f2e-b541-03aeb46e8a85","Type":"ContainerStarted","Data":"8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86"} Dec 04 13:47:00 crc kubenswrapper[4760]: I1204 13:47:00.224040 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9t9g" event={"ID":"70929ae1-f1a9-4f2e-b541-03aeb46e8a85","Type":"ContainerStarted","Data":"dbd4e52f3a570692837bce89f571060492924c8fdf649de95c8e4b6b11ca02d8"} Dec 04 13:47:01 crc kubenswrapper[4760]: I1204 13:47:01.236656 4760 generic.go:334] "Generic (PLEG): container finished" podID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerID="8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86" exitCode=0 Dec 04 13:47:01 crc kubenswrapper[4760]: I1204 13:47:01.236752 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9t9g" event={"ID":"70929ae1-f1a9-4f2e-b541-03aeb46e8a85","Type":"ContainerDied","Data":"8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86"} Dec 04 13:47:02 crc kubenswrapper[4760]: I1204 13:47:02.249307 4760 generic.go:334] "Generic (PLEG): container finished" podID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerID="632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79" exitCode=0 Dec 04 13:47:02 crc kubenswrapper[4760]: I1204 13:47:02.249346 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9t9g" 
event={"ID":"70929ae1-f1a9-4f2e-b541-03aeb46e8a85","Type":"ContainerDied","Data":"632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79"} Dec 04 13:47:03 crc kubenswrapper[4760]: I1204 13:47:03.265631 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9t9g" event={"ID":"70929ae1-f1a9-4f2e-b541-03aeb46e8a85","Type":"ContainerStarted","Data":"9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c"} Dec 04 13:47:03 crc kubenswrapper[4760]: I1204 13:47:03.295405 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c9t9g" podStartSLOduration=1.636482466 podStartE2EDuration="4.29538489s" podCreationTimestamp="2025-12-04 13:46:59 +0000 UTC" firstStartedPulling="2025-12-04 13:47:00.225779637 +0000 UTC m=+5623.267226204" lastFinishedPulling="2025-12-04 13:47:02.884682061 +0000 UTC m=+5625.926128628" observedRunningTime="2025-12-04 13:47:03.29123245 +0000 UTC m=+5626.332679007" watchObservedRunningTime="2025-12-04 13:47:03.29538489 +0000 UTC m=+5626.336831457" Dec 04 13:47:09 crc kubenswrapper[4760]: I1204 13:47:09.404394 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:47:09 crc kubenswrapper[4760]: I1204 13:47:09.405065 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:47:09 crc kubenswrapper[4760]: I1204 13:47:09.458967 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:47:10 crc kubenswrapper[4760]: I1204 13:47:10.394990 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:47:10 crc kubenswrapper[4760]: I1204 13:47:10.453851 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-c9t9g"] Dec 04 13:47:12 crc kubenswrapper[4760]: I1204 13:47:12.349487 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c9t9g" podUID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerName="registry-server" containerID="cri-o://9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c" gracePeriod=2 Dec 04 13:47:12 crc kubenswrapper[4760]: I1204 13:47:12.828163 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:47:12 crc kubenswrapper[4760]: I1204 13:47:12.974621 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz9kf\" (UniqueName: \"kubernetes.io/projected/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-kube-api-access-bz9kf\") pod \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " Dec 04 13:47:12 crc kubenswrapper[4760]: I1204 13:47:12.975001 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-utilities\") pod \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " Dec 04 13:47:12 crc kubenswrapper[4760]: I1204 13:47:12.975193 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-catalog-content\") pod \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\" (UID: \"70929ae1-f1a9-4f2e-b541-03aeb46e8a85\") " Dec 04 13:47:12 crc kubenswrapper[4760]: I1204 13:47:12.976302 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-utilities" (OuterVolumeSpecName: "utilities") pod "70929ae1-f1a9-4f2e-b541-03aeb46e8a85" (UID: 
"70929ae1-f1a9-4f2e-b541-03aeb46e8a85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:47:12 crc kubenswrapper[4760]: I1204 13:47:12.986571 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-kube-api-access-bz9kf" (OuterVolumeSpecName: "kube-api-access-bz9kf") pod "70929ae1-f1a9-4f2e-b541-03aeb46e8a85" (UID: "70929ae1-f1a9-4f2e-b541-03aeb46e8a85"). InnerVolumeSpecName "kube-api-access-bz9kf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:47:12 crc kubenswrapper[4760]: I1204 13:47:12.994754 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70929ae1-f1a9-4f2e-b541-03aeb46e8a85" (UID: "70929ae1-f1a9-4f2e-b541-03aeb46e8a85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.077526 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.077559 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.077571 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz9kf\" (UniqueName: \"kubernetes.io/projected/70929ae1-f1a9-4f2e-b541-03aeb46e8a85-kube-api-access-bz9kf\") on node \"crc\" DevicePath \"\"" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.360708 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerID="9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c" exitCode=0 Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.360757 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9t9g" event={"ID":"70929ae1-f1a9-4f2e-b541-03aeb46e8a85","Type":"ContainerDied","Data":"9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c"} Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.360791 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9t9g" event={"ID":"70929ae1-f1a9-4f2e-b541-03aeb46e8a85","Type":"ContainerDied","Data":"dbd4e52f3a570692837bce89f571060492924c8fdf649de95c8e4b6b11ca02d8"} Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.360812 4760 scope.go:117] "RemoveContainer" containerID="9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.360965 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9t9g" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.384975 4760 scope.go:117] "RemoveContainer" containerID="632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.412693 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9t9g"] Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.423395 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9t9g"] Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.429253 4760 scope.go:117] "RemoveContainer" containerID="8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.457369 4760 scope.go:117] "RemoveContainer" containerID="9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c" Dec 04 13:47:13 crc kubenswrapper[4760]: E1204 13:47:13.458978 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c\": container with ID starting with 9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c not found: ID does not exist" containerID="9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.459176 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c"} err="failed to get container status \"9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c\": rpc error: code = NotFound desc = could not find container \"9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c\": container with ID starting with 9adf785a2e2ec4dfad612c40b2746a295e209890a05a175d66c2519d679f622c not found: 
ID does not exist" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.459221 4760 scope.go:117] "RemoveContainer" containerID="632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79" Dec 04 13:47:13 crc kubenswrapper[4760]: E1204 13:47:13.459970 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79\": container with ID starting with 632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79 not found: ID does not exist" containerID="632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.460034 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79"} err="failed to get container status \"632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79\": rpc error: code = NotFound desc = could not find container \"632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79\": container with ID starting with 632642834ff5f3c0c09723dfea72617748bcaa695fc5e2b4a378225e8368bb79 not found: ID does not exist" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.460070 4760 scope.go:117] "RemoveContainer" containerID="8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86" Dec 04 13:47:13 crc kubenswrapper[4760]: E1204 13:47:13.460401 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86\": container with ID starting with 8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86 not found: ID does not exist" containerID="8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.460441 4760 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86"} err="failed to get container status \"8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86\": rpc error: code = NotFound desc = could not find container \"8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86\": container with ID starting with 8922e88dd0623cac47799333fea1b1e28122f266802a5ec1806061a7b8fa5d86 not found: ID does not exist" Dec 04 13:47:13 crc kubenswrapper[4760]: I1204 13:47:13.876199 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" path="/var/lib/kubelet/pods/70929ae1-f1a9-4f2e-b541-03aeb46e8a85/volumes" Dec 04 13:47:37 crc kubenswrapper[4760]: I1204 13:47:37.618497 4760 generic.go:334] "Generic (PLEG): container finished" podID="f52bc647-d752-45f2-a391-2d676657775b" containerID="4b3f90b5dd6589eb19cf6c140e6095a97bb1a04607b02f1771df6261af15dcba" exitCode=0 Dec 04 13:47:37 crc kubenswrapper[4760]: I1204 13:47:37.618567 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tm5jt/must-gather-rwfhl" event={"ID":"f52bc647-d752-45f2-a391-2d676657775b","Type":"ContainerDied","Data":"4b3f90b5dd6589eb19cf6c140e6095a97bb1a04607b02f1771df6261af15dcba"} Dec 04 13:47:37 crc kubenswrapper[4760]: I1204 13:47:37.620524 4760 scope.go:117] "RemoveContainer" containerID="4b3f90b5dd6589eb19cf6c140e6095a97bb1a04607b02f1771df6261af15dcba" Dec 04 13:47:38 crc kubenswrapper[4760]: I1204 13:47:38.358653 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tm5jt_must-gather-rwfhl_f52bc647-d752-45f2-a391-2d676657775b/gather/0.log" Dec 04 13:47:47 crc kubenswrapper[4760]: I1204 13:47:47.349077 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tm5jt/must-gather-rwfhl"] Dec 04 13:47:47 crc kubenswrapper[4760]: I1204 13:47:47.349910 4760 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-must-gather-tm5jt/must-gather-rwfhl" podUID="f52bc647-d752-45f2-a391-2d676657775b" containerName="copy" containerID="cri-o://889962dedac26d990a7ca1811915317a883c424a22914ba7b74a7d4787cc1555" gracePeriod=2 Dec 04 13:47:47 crc kubenswrapper[4760]: I1204 13:47:47.360465 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tm5jt/must-gather-rwfhl"] Dec 04 13:47:47 crc kubenswrapper[4760]: I1204 13:47:47.733750 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tm5jt_must-gather-rwfhl_f52bc647-d752-45f2-a391-2d676657775b/copy/0.log" Dec 04 13:47:47 crc kubenswrapper[4760]: I1204 13:47:47.734368 4760 generic.go:334] "Generic (PLEG): container finished" podID="f52bc647-d752-45f2-a391-2d676657775b" containerID="889962dedac26d990a7ca1811915317a883c424a22914ba7b74a7d4787cc1555" exitCode=143 Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.064353 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tm5jt_must-gather-rwfhl_f52bc647-d752-45f2-a391-2d676657775b/copy/0.log" Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.065137 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tm5jt/must-gather-rwfhl" Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.088141 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f52bc647-d752-45f2-a391-2d676657775b-must-gather-output\") pod \"f52bc647-d752-45f2-a391-2d676657775b\" (UID: \"f52bc647-d752-45f2-a391-2d676657775b\") " Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.088288 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss9kn\" (UniqueName: \"kubernetes.io/projected/f52bc647-d752-45f2-a391-2d676657775b-kube-api-access-ss9kn\") pod \"f52bc647-d752-45f2-a391-2d676657775b\" (UID: \"f52bc647-d752-45f2-a391-2d676657775b\") " Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.097625 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52bc647-d752-45f2-a391-2d676657775b-kube-api-access-ss9kn" (OuterVolumeSpecName: "kube-api-access-ss9kn") pod "f52bc647-d752-45f2-a391-2d676657775b" (UID: "f52bc647-d752-45f2-a391-2d676657775b"). InnerVolumeSpecName "kube-api-access-ss9kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.190658 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss9kn\" (UniqueName: \"kubernetes.io/projected/f52bc647-d752-45f2-a391-2d676657775b-kube-api-access-ss9kn\") on node \"crc\" DevicePath \"\"" Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.279693 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f52bc647-d752-45f2-a391-2d676657775b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f52bc647-d752-45f2-a391-2d676657775b" (UID: "f52bc647-d752-45f2-a391-2d676657775b"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.293368 4760 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f52bc647-d752-45f2-a391-2d676657775b-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.745398 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tm5jt_must-gather-rwfhl_f52bc647-d752-45f2-a391-2d676657775b/copy/0.log" Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.746117 4760 scope.go:117] "RemoveContainer" containerID="889962dedac26d990a7ca1811915317a883c424a22914ba7b74a7d4787cc1555" Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.746139 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tm5jt/must-gather-rwfhl" Dec 04 13:47:48 crc kubenswrapper[4760]: I1204 13:47:48.769063 4760 scope.go:117] "RemoveContainer" containerID="4b3f90b5dd6589eb19cf6c140e6095a97bb1a04607b02f1771df6261af15dcba" Dec 04 13:47:49 crc kubenswrapper[4760]: I1204 13:47:49.874931 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52bc647-d752-45f2-a391-2d676657775b" path="/var/lib/kubelet/pods/f52bc647-d752-45f2-a391-2d676657775b/volumes" Dec 04 13:48:02 crc kubenswrapper[4760]: I1204 13:48:02.297382 4760 scope.go:117] "RemoveContainer" containerID="45f5259f59239f0de993b5859e01ae4c095bf63db3bfb3fd87f01b7b872f0488" Dec 04 13:49:02 crc kubenswrapper[4760]: I1204 13:49:02.396830 4760 scope.go:117] "RemoveContainer" containerID="7f172888436492b34aab61b84d2f8622b96cd2721c16ab0b1b4dc6d640122ffe" Dec 04 13:49:03 crc kubenswrapper[4760]: I1204 13:49:03.380645 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:49:03 crc kubenswrapper[4760]: I1204 13:49:03.381223 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:49:33 crc kubenswrapper[4760]: I1204 13:49:33.380140 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:49:33 crc kubenswrapper[4760]: I1204 13:49:33.380741 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:50:03 crc kubenswrapper[4760]: I1204 13:50:03.380739 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:50:03 crc kubenswrapper[4760]: I1204 13:50:03.381319 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 
13:50:03 crc kubenswrapper[4760]: I1204 13:50:03.381379 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 13:50:03 crc kubenswrapper[4760]: I1204 13:50:03.382354 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 13:50:03 crc kubenswrapper[4760]: I1204 13:50:03.382428 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" containerID="cri-o://f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" gracePeriod=600 Dec 04 13:50:03 crc kubenswrapper[4760]: E1204 13:50:03.504775 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:50:04 crc kubenswrapper[4760]: I1204 13:50:04.141572 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" exitCode=0 Dec 04 13:50:04 crc kubenswrapper[4760]: I1204 13:50:04.141693 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" 
event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f"} Dec 04 13:50:04 crc kubenswrapper[4760]: I1204 13:50:04.141975 4760 scope.go:117] "RemoveContainer" containerID="d08476c52d8ebac0dd6f63e9a6446171d7ea6de161be3bbf19b17d7a0d8daa01" Dec 04 13:50:04 crc kubenswrapper[4760]: I1204 13:50:04.142925 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:50:04 crc kubenswrapper[4760]: E1204 13:50:04.143342 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:50:19 crc kubenswrapper[4760]: I1204 13:50:19.870467 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:50:19 crc kubenswrapper[4760]: E1204 13:50:19.871546 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:50:33 crc kubenswrapper[4760]: I1204 13:50:33.864864 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:50:33 crc kubenswrapper[4760]: E1204 13:50:33.865643 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:50:46 crc kubenswrapper[4760]: I1204 13:50:46.864110 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:50:46 crc kubenswrapper[4760]: E1204 13:50:46.864891 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:50:58 crc kubenswrapper[4760]: I1204 13:50:58.864959 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:50:58 crc kubenswrapper[4760]: E1204 13:50:58.865852 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.564888 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kvshw/must-gather-rjrw2"] Dec 04 13:51:03 crc kubenswrapper[4760]: E1204 13:51:03.566208 4760 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerName="extract-content" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.566246 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerName="extract-content" Dec 04 13:51:03 crc kubenswrapper[4760]: E1204 13:51:03.566281 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52bc647-d752-45f2-a391-2d676657775b" containerName="gather" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.566289 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52bc647-d752-45f2-a391-2d676657775b" containerName="gather" Dec 04 13:51:03 crc kubenswrapper[4760]: E1204 13:51:03.566302 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerName="extract-utilities" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.566311 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerName="extract-utilities" Dec 04 13:51:03 crc kubenswrapper[4760]: E1204 13:51:03.566321 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52bc647-d752-45f2-a391-2d676657775b" containerName="copy" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.566328 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52bc647-d752-45f2-a391-2d676657775b" containerName="copy" Dec 04 13:51:03 crc kubenswrapper[4760]: E1204 13:51:03.566352 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerName="registry-server" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.566360 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerName="registry-server" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.566619 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52bc647-d752-45f2-a391-2d676657775b" 
containerName="copy" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.566636 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="70929ae1-f1a9-4f2e-b541-03aeb46e8a85" containerName="registry-server" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.566693 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52bc647-d752-45f2-a391-2d676657775b" containerName="gather" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.567971 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvshw/must-gather-rjrw2" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.574673 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kvshw"/"openshift-service-ca.crt" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.574674 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kvshw"/"kube-root-ca.crt" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.630497 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79911128-6dfb-4fa9-b375-29ad707c556c-must-gather-output\") pod \"must-gather-rjrw2\" (UID: \"79911128-6dfb-4fa9-b375-29ad707c556c\") " pod="openshift-must-gather-kvshw/must-gather-rjrw2" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.630896 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z9z5\" (UniqueName: \"kubernetes.io/projected/79911128-6dfb-4fa9-b375-29ad707c556c-kube-api-access-2z9z5\") pod \"must-gather-rjrw2\" (UID: \"79911128-6dfb-4fa9-b375-29ad707c556c\") " pod="openshift-must-gather-kvshw/must-gather-rjrw2" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.656741 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kvshw/must-gather-rjrw2"] Dec 04 13:51:03 
crc kubenswrapper[4760]: I1204 13:51:03.733494 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79911128-6dfb-4fa9-b375-29ad707c556c-must-gather-output\") pod \"must-gather-rjrw2\" (UID: \"79911128-6dfb-4fa9-b375-29ad707c556c\") " pod="openshift-must-gather-kvshw/must-gather-rjrw2" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.733830 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z9z5\" (UniqueName: \"kubernetes.io/projected/79911128-6dfb-4fa9-b375-29ad707c556c-kube-api-access-2z9z5\") pod \"must-gather-rjrw2\" (UID: \"79911128-6dfb-4fa9-b375-29ad707c556c\") " pod="openshift-must-gather-kvshw/must-gather-rjrw2" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.734016 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79911128-6dfb-4fa9-b375-29ad707c556c-must-gather-output\") pod \"must-gather-rjrw2\" (UID: \"79911128-6dfb-4fa9-b375-29ad707c556c\") " pod="openshift-must-gather-kvshw/must-gather-rjrw2" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.766204 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z9z5\" (UniqueName: \"kubernetes.io/projected/79911128-6dfb-4fa9-b375-29ad707c556c-kube-api-access-2z9z5\") pod \"must-gather-rjrw2\" (UID: \"79911128-6dfb-4fa9-b375-29ad707c556c\") " pod="openshift-must-gather-kvshw/must-gather-rjrw2" Dec 04 13:51:03 crc kubenswrapper[4760]: I1204 13:51:03.900531 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvshw/must-gather-rjrw2" Dec 04 13:51:04 crc kubenswrapper[4760]: I1204 13:51:04.401328 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kvshw/must-gather-rjrw2"] Dec 04 13:51:04 crc kubenswrapper[4760]: I1204 13:51:04.718391 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/must-gather-rjrw2" event={"ID":"79911128-6dfb-4fa9-b375-29ad707c556c","Type":"ContainerStarted","Data":"137f7c821bd109de47d440af598b9b52ffbf7792347105c2891d3384d7b75782"} Dec 04 13:51:05 crc kubenswrapper[4760]: I1204 13:51:05.763751 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/must-gather-rjrw2" event={"ID":"79911128-6dfb-4fa9-b375-29ad707c556c","Type":"ContainerStarted","Data":"c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f"} Dec 04 13:51:05 crc kubenswrapper[4760]: I1204 13:51:05.764364 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/must-gather-rjrw2" event={"ID":"79911128-6dfb-4fa9-b375-29ad707c556c","Type":"ContainerStarted","Data":"f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af"} Dec 04 13:51:05 crc kubenswrapper[4760]: I1204 13:51:05.801068 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kvshw/must-gather-rjrw2" podStartSLOduration=2.801029068 podStartE2EDuration="2.801029068s" podCreationTimestamp="2025-12-04 13:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 13:51:05.781733383 +0000 UTC m=+5868.823179950" watchObservedRunningTime="2025-12-04 13:51:05.801029068 +0000 UTC m=+5868.842475625" Dec 04 13:51:10 crc kubenswrapper[4760]: I1204 13:51:10.162727 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kvshw/crc-debug-c9x5t"] Dec 04 13:51:10 crc kubenswrapper[4760]: 
I1204 13:51:10.164894 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-c9x5t" Dec 04 13:51:10 crc kubenswrapper[4760]: I1204 13:51:10.167859 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kvshw"/"default-dockercfg-k8wvs" Dec 04 13:51:10 crc kubenswrapper[4760]: I1204 13:51:10.339541 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zp2n\" (UniqueName: \"kubernetes.io/projected/3ab082a0-cf7d-4664-8d7f-03005eb986a0-kube-api-access-5zp2n\") pod \"crc-debug-c9x5t\" (UID: \"3ab082a0-cf7d-4664-8d7f-03005eb986a0\") " pod="openshift-must-gather-kvshw/crc-debug-c9x5t" Dec 04 13:51:10 crc kubenswrapper[4760]: I1204 13:51:10.339663 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ab082a0-cf7d-4664-8d7f-03005eb986a0-host\") pod \"crc-debug-c9x5t\" (UID: \"3ab082a0-cf7d-4664-8d7f-03005eb986a0\") " pod="openshift-must-gather-kvshw/crc-debug-c9x5t" Dec 04 13:51:10 crc kubenswrapper[4760]: I1204 13:51:10.442411 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zp2n\" (UniqueName: \"kubernetes.io/projected/3ab082a0-cf7d-4664-8d7f-03005eb986a0-kube-api-access-5zp2n\") pod \"crc-debug-c9x5t\" (UID: \"3ab082a0-cf7d-4664-8d7f-03005eb986a0\") " pod="openshift-must-gather-kvshw/crc-debug-c9x5t" Dec 04 13:51:10 crc kubenswrapper[4760]: I1204 13:51:10.442558 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ab082a0-cf7d-4664-8d7f-03005eb986a0-host\") pod \"crc-debug-c9x5t\" (UID: \"3ab082a0-cf7d-4664-8d7f-03005eb986a0\") " pod="openshift-must-gather-kvshw/crc-debug-c9x5t" Dec 04 13:51:10 crc kubenswrapper[4760]: I1204 13:51:10.442745 4760 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ab082a0-cf7d-4664-8d7f-03005eb986a0-host\") pod \"crc-debug-c9x5t\" (UID: \"3ab082a0-cf7d-4664-8d7f-03005eb986a0\") " pod="openshift-must-gather-kvshw/crc-debug-c9x5t" Dec 04 13:51:10 crc kubenswrapper[4760]: I1204 13:51:10.471410 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zp2n\" (UniqueName: \"kubernetes.io/projected/3ab082a0-cf7d-4664-8d7f-03005eb986a0-kube-api-access-5zp2n\") pod \"crc-debug-c9x5t\" (UID: \"3ab082a0-cf7d-4664-8d7f-03005eb986a0\") " pod="openshift-must-gather-kvshw/crc-debug-c9x5t" Dec 04 13:51:10 crc kubenswrapper[4760]: I1204 13:51:10.493411 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-c9x5t" Dec 04 13:51:10 crc kubenswrapper[4760]: W1204 13:51:10.545775 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ab082a0_cf7d_4664_8d7f_03005eb986a0.slice/crio-c603db732d2b202a5a9f2427554ab62ae4e1a73f68dfa2f4ed7dac45afaf6fc7 WatchSource:0}: Error finding container c603db732d2b202a5a9f2427554ab62ae4e1a73f68dfa2f4ed7dac45afaf6fc7: Status 404 returned error can't find the container with id c603db732d2b202a5a9f2427554ab62ae4e1a73f68dfa2f4ed7dac45afaf6fc7 Dec 04 13:51:10 crc kubenswrapper[4760]: I1204 13:51:10.817460 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/crc-debug-c9x5t" event={"ID":"3ab082a0-cf7d-4664-8d7f-03005eb986a0","Type":"ContainerStarted","Data":"c603db732d2b202a5a9f2427554ab62ae4e1a73f68dfa2f4ed7dac45afaf6fc7"} Dec 04 13:51:11 crc kubenswrapper[4760]: I1204 13:51:11.833362 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/crc-debug-c9x5t" event={"ID":"3ab082a0-cf7d-4664-8d7f-03005eb986a0","Type":"ContainerStarted","Data":"1a8606bd08bdbca46334b95166723a27bb9285fa05c856af7035dbc2d03e8a47"} Dec 
04 13:51:11 crc kubenswrapper[4760]: I1204 13:51:11.857759 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kvshw/crc-debug-c9x5t" podStartSLOduration=1.857738479 podStartE2EDuration="1.857738479s" podCreationTimestamp="2025-12-04 13:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 13:51:11.847779627 +0000 UTC m=+5874.889226194" watchObservedRunningTime="2025-12-04 13:51:11.857738479 +0000 UTC m=+5874.899185046" Dec 04 13:51:12 crc kubenswrapper[4760]: I1204 13:51:12.866268 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:51:12 crc kubenswrapper[4760]: E1204 13:51:12.866827 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:51:23 crc kubenswrapper[4760]: I1204 13:51:23.865120 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:51:23 crc kubenswrapper[4760]: E1204 13:51:23.865977 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:51:35 crc kubenswrapper[4760]: I1204 13:51:35.873751 4760 scope.go:117] 
"RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:51:35 crc kubenswrapper[4760]: E1204 13:51:35.874816 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:51:48 crc kubenswrapper[4760]: I1204 13:51:48.865089 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:51:48 crc kubenswrapper[4760]: E1204 13:51:48.866068 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:51:52 crc kubenswrapper[4760]: I1204 13:51:52.278601 4760 generic.go:334] "Generic (PLEG): container finished" podID="3ab082a0-cf7d-4664-8d7f-03005eb986a0" containerID="1a8606bd08bdbca46334b95166723a27bb9285fa05c856af7035dbc2d03e8a47" exitCode=0 Dec 04 13:51:52 crc kubenswrapper[4760]: I1204 13:51:52.278709 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/crc-debug-c9x5t" event={"ID":"3ab082a0-cf7d-4664-8d7f-03005eb986a0","Type":"ContainerDied","Data":"1a8606bd08bdbca46334b95166723a27bb9285fa05c856af7035dbc2d03e8a47"} Dec 04 13:51:53 crc kubenswrapper[4760]: I1204 13:51:53.386935 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-c9x5t" Dec 04 13:51:53 crc kubenswrapper[4760]: I1204 13:51:53.426791 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kvshw/crc-debug-c9x5t"] Dec 04 13:51:53 crc kubenswrapper[4760]: I1204 13:51:53.436035 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kvshw/crc-debug-c9x5t"] Dec 04 13:51:53 crc kubenswrapper[4760]: I1204 13:51:53.482719 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ab082a0-cf7d-4664-8d7f-03005eb986a0-host\") pod \"3ab082a0-cf7d-4664-8d7f-03005eb986a0\" (UID: \"3ab082a0-cf7d-4664-8d7f-03005eb986a0\") " Dec 04 13:51:53 crc kubenswrapper[4760]: I1204 13:51:53.482853 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zp2n\" (UniqueName: \"kubernetes.io/projected/3ab082a0-cf7d-4664-8d7f-03005eb986a0-kube-api-access-5zp2n\") pod \"3ab082a0-cf7d-4664-8d7f-03005eb986a0\" (UID: \"3ab082a0-cf7d-4664-8d7f-03005eb986a0\") " Dec 04 13:51:53 crc kubenswrapper[4760]: I1204 13:51:53.482872 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ab082a0-cf7d-4664-8d7f-03005eb986a0-host" (OuterVolumeSpecName: "host") pod "3ab082a0-cf7d-4664-8d7f-03005eb986a0" (UID: "3ab082a0-cf7d-4664-8d7f-03005eb986a0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 13:51:53 crc kubenswrapper[4760]: I1204 13:51:53.483401 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ab082a0-cf7d-4664-8d7f-03005eb986a0-host\") on node \"crc\" DevicePath \"\"" Dec 04 13:51:53 crc kubenswrapper[4760]: I1204 13:51:53.488508 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab082a0-cf7d-4664-8d7f-03005eb986a0-kube-api-access-5zp2n" (OuterVolumeSpecName: "kube-api-access-5zp2n") pod "3ab082a0-cf7d-4664-8d7f-03005eb986a0" (UID: "3ab082a0-cf7d-4664-8d7f-03005eb986a0"). InnerVolumeSpecName "kube-api-access-5zp2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:51:53 crc kubenswrapper[4760]: I1204 13:51:53.585172 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zp2n\" (UniqueName: \"kubernetes.io/projected/3ab082a0-cf7d-4664-8d7f-03005eb986a0-kube-api-access-5zp2n\") on node \"crc\" DevicePath \"\"" Dec 04 13:51:53 crc kubenswrapper[4760]: I1204 13:51:53.875677 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab082a0-cf7d-4664-8d7f-03005eb986a0" path="/var/lib/kubelet/pods/3ab082a0-cf7d-4664-8d7f-03005eb986a0/volumes" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.296470 4760 scope.go:117] "RemoveContainer" containerID="1a8606bd08bdbca46334b95166723a27bb9285fa05c856af7035dbc2d03e8a47" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.296567 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-c9x5t" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.580483 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kvshw/crc-debug-5pmft"] Dec 04 13:51:54 crc kubenswrapper[4760]: E1204 13:51:54.581359 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab082a0-cf7d-4664-8d7f-03005eb986a0" containerName="container-00" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.581374 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab082a0-cf7d-4664-8d7f-03005eb986a0" containerName="container-00" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.581588 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab082a0-cf7d-4664-8d7f-03005eb986a0" containerName="container-00" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.582338 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-5pmft" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.586471 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kvshw"/"default-dockercfg-k8wvs" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.709638 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsjst\" (UniqueName: \"kubernetes.io/projected/cb9852d3-8dcb-4720-8e27-450d532bcfcf-kube-api-access-bsjst\") pod \"crc-debug-5pmft\" (UID: \"cb9852d3-8dcb-4720-8e27-450d532bcfcf\") " pod="openshift-must-gather-kvshw/crc-debug-5pmft" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.709936 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb9852d3-8dcb-4720-8e27-450d532bcfcf-host\") pod \"crc-debug-5pmft\" (UID: \"cb9852d3-8dcb-4720-8e27-450d532bcfcf\") " 
pod="openshift-must-gather-kvshw/crc-debug-5pmft" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.811497 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb9852d3-8dcb-4720-8e27-450d532bcfcf-host\") pod \"crc-debug-5pmft\" (UID: \"cb9852d3-8dcb-4720-8e27-450d532bcfcf\") " pod="openshift-must-gather-kvshw/crc-debug-5pmft" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.811605 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsjst\" (UniqueName: \"kubernetes.io/projected/cb9852d3-8dcb-4720-8e27-450d532bcfcf-kube-api-access-bsjst\") pod \"crc-debug-5pmft\" (UID: \"cb9852d3-8dcb-4720-8e27-450d532bcfcf\") " pod="openshift-must-gather-kvshw/crc-debug-5pmft" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.812084 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb9852d3-8dcb-4720-8e27-450d532bcfcf-host\") pod \"crc-debug-5pmft\" (UID: \"cb9852d3-8dcb-4720-8e27-450d532bcfcf\") " pod="openshift-must-gather-kvshw/crc-debug-5pmft" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.830990 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsjst\" (UniqueName: \"kubernetes.io/projected/cb9852d3-8dcb-4720-8e27-450d532bcfcf-kube-api-access-bsjst\") pod \"crc-debug-5pmft\" (UID: \"cb9852d3-8dcb-4720-8e27-450d532bcfcf\") " pod="openshift-must-gather-kvshw/crc-debug-5pmft" Dec 04 13:51:54 crc kubenswrapper[4760]: I1204 13:51:54.901736 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-5pmft" Dec 04 13:51:55 crc kubenswrapper[4760]: I1204 13:51:55.312983 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/crc-debug-5pmft" event={"ID":"cb9852d3-8dcb-4720-8e27-450d532bcfcf","Type":"ContainerStarted","Data":"b41980ca2dee1cac29b052ab11c07aa22d3f7a7a0ca43f37339c11ab9ce93dca"} Dec 04 13:51:56 crc kubenswrapper[4760]: I1204 13:51:56.322489 4760 generic.go:334] "Generic (PLEG): container finished" podID="cb9852d3-8dcb-4720-8e27-450d532bcfcf" containerID="a2c6f49e07ce79f11cb369d0bbca8217cedf34526579d5df5e425235707686e8" exitCode=0 Dec 04 13:51:56 crc kubenswrapper[4760]: I1204 13:51:56.322773 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/crc-debug-5pmft" event={"ID":"cb9852d3-8dcb-4720-8e27-450d532bcfcf","Type":"ContainerDied","Data":"a2c6f49e07ce79f11cb369d0bbca8217cedf34526579d5df5e425235707686e8"} Dec 04 13:51:57 crc kubenswrapper[4760]: I1204 13:51:57.485284 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-5pmft" Dec 04 13:51:57 crc kubenswrapper[4760]: I1204 13:51:57.571898 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb9852d3-8dcb-4720-8e27-450d532bcfcf-host\") pod \"cb9852d3-8dcb-4720-8e27-450d532bcfcf\" (UID: \"cb9852d3-8dcb-4720-8e27-450d532bcfcf\") " Dec 04 13:51:57 crc kubenswrapper[4760]: I1204 13:51:57.572002 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsjst\" (UniqueName: \"kubernetes.io/projected/cb9852d3-8dcb-4720-8e27-450d532bcfcf-kube-api-access-bsjst\") pod \"cb9852d3-8dcb-4720-8e27-450d532bcfcf\" (UID: \"cb9852d3-8dcb-4720-8e27-450d532bcfcf\") " Dec 04 13:51:57 crc kubenswrapper[4760]: I1204 13:51:57.572785 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb9852d3-8dcb-4720-8e27-450d532bcfcf-host" (OuterVolumeSpecName: "host") pod "cb9852d3-8dcb-4720-8e27-450d532bcfcf" (UID: "cb9852d3-8dcb-4720-8e27-450d532bcfcf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 13:51:57 crc kubenswrapper[4760]: I1204 13:51:57.573887 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb9852d3-8dcb-4720-8e27-450d532bcfcf-host\") on node \"crc\" DevicePath \"\"" Dec 04 13:51:57 crc kubenswrapper[4760]: I1204 13:51:57.594709 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9852d3-8dcb-4720-8e27-450d532bcfcf-kube-api-access-bsjst" (OuterVolumeSpecName: "kube-api-access-bsjst") pod "cb9852d3-8dcb-4720-8e27-450d532bcfcf" (UID: "cb9852d3-8dcb-4720-8e27-450d532bcfcf"). InnerVolumeSpecName "kube-api-access-bsjst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:51:57 crc kubenswrapper[4760]: I1204 13:51:57.675639 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsjst\" (UniqueName: \"kubernetes.io/projected/cb9852d3-8dcb-4720-8e27-450d532bcfcf-kube-api-access-bsjst\") on node \"crc\" DevicePath \"\"" Dec 04 13:51:58 crc kubenswrapper[4760]: I1204 13:51:58.346693 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/crc-debug-5pmft" event={"ID":"cb9852d3-8dcb-4720-8e27-450d532bcfcf","Type":"ContainerDied","Data":"b41980ca2dee1cac29b052ab11c07aa22d3f7a7a0ca43f37339c11ab9ce93dca"} Dec 04 13:51:58 crc kubenswrapper[4760]: I1204 13:51:58.347058 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b41980ca2dee1cac29b052ab11c07aa22d3f7a7a0ca43f37339c11ab9ce93dca" Dec 04 13:51:58 crc kubenswrapper[4760]: I1204 13:51:58.347127 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-5pmft" Dec 04 13:51:58 crc kubenswrapper[4760]: I1204 13:51:58.927350 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kvshw/crc-debug-5pmft"] Dec 04 13:51:58 crc kubenswrapper[4760]: I1204 13:51:58.941317 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kvshw/crc-debug-5pmft"] Dec 04 13:51:59 crc kubenswrapper[4760]: I1204 13:51:59.877538 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb9852d3-8dcb-4720-8e27-450d532bcfcf" path="/var/lib/kubelet/pods/cb9852d3-8dcb-4720-8e27-450d532bcfcf/volumes" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.147881 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kvshw/crc-debug-gd6rn"] Dec 04 13:52:00 crc kubenswrapper[4760]: E1204 13:52:00.148438 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9852d3-8dcb-4720-8e27-450d532bcfcf" 
containerName="container-00" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.148452 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9852d3-8dcb-4720-8e27-450d532bcfcf" containerName="container-00" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.148656 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9852d3-8dcb-4720-8e27-450d532bcfcf" containerName="container-00" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.149413 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-gd6rn" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.152473 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kvshw"/"default-dockercfg-k8wvs" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.226696 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9lff\" (UniqueName: \"kubernetes.io/projected/cd85ec41-5ddd-4337-a424-dea0528258bf-kube-api-access-h9lff\") pod \"crc-debug-gd6rn\" (UID: \"cd85ec41-5ddd-4337-a424-dea0528258bf\") " pod="openshift-must-gather-kvshw/crc-debug-gd6rn" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.226805 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd85ec41-5ddd-4337-a424-dea0528258bf-host\") pod \"crc-debug-gd6rn\" (UID: \"cd85ec41-5ddd-4337-a424-dea0528258bf\") " pod="openshift-must-gather-kvshw/crc-debug-gd6rn" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.328495 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9lff\" (UniqueName: \"kubernetes.io/projected/cd85ec41-5ddd-4337-a424-dea0528258bf-kube-api-access-h9lff\") pod \"crc-debug-gd6rn\" (UID: \"cd85ec41-5ddd-4337-a424-dea0528258bf\") " pod="openshift-must-gather-kvshw/crc-debug-gd6rn" Dec 04 13:52:00 
crc kubenswrapper[4760]: I1204 13:52:00.328606 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd85ec41-5ddd-4337-a424-dea0528258bf-host\") pod \"crc-debug-gd6rn\" (UID: \"cd85ec41-5ddd-4337-a424-dea0528258bf\") " pod="openshift-must-gather-kvshw/crc-debug-gd6rn" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.328809 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd85ec41-5ddd-4337-a424-dea0528258bf-host\") pod \"crc-debug-gd6rn\" (UID: \"cd85ec41-5ddd-4337-a424-dea0528258bf\") " pod="openshift-must-gather-kvshw/crc-debug-gd6rn" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.347914 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9lff\" (UniqueName: \"kubernetes.io/projected/cd85ec41-5ddd-4337-a424-dea0528258bf-kube-api-access-h9lff\") pod \"crc-debug-gd6rn\" (UID: \"cd85ec41-5ddd-4337-a424-dea0528258bf\") " pod="openshift-must-gather-kvshw/crc-debug-gd6rn" Dec 04 13:52:00 crc kubenswrapper[4760]: I1204 13:52:00.468513 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-gd6rn" Dec 04 13:52:01 crc kubenswrapper[4760]: I1204 13:52:01.376011 4760 generic.go:334] "Generic (PLEG): container finished" podID="cd85ec41-5ddd-4337-a424-dea0528258bf" containerID="f3ff1d50424d706dc47bee2389dc746da2af6d387a5f9c42971b60f7fcb41a18" exitCode=0 Dec 04 13:52:01 crc kubenswrapper[4760]: I1204 13:52:01.376103 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/crc-debug-gd6rn" event={"ID":"cd85ec41-5ddd-4337-a424-dea0528258bf","Type":"ContainerDied","Data":"f3ff1d50424d706dc47bee2389dc746da2af6d387a5f9c42971b60f7fcb41a18"} Dec 04 13:52:01 crc kubenswrapper[4760]: I1204 13:52:01.376580 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/crc-debug-gd6rn" event={"ID":"cd85ec41-5ddd-4337-a424-dea0528258bf","Type":"ContainerStarted","Data":"a79f4442dd78c7c400ca58b0d6eaf24977d70f044ff62d419f35e8168aa02c2e"} Dec 04 13:52:01 crc kubenswrapper[4760]: I1204 13:52:01.420505 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kvshw/crc-debug-gd6rn"] Dec 04 13:52:01 crc kubenswrapper[4760]: I1204 13:52:01.431262 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kvshw/crc-debug-gd6rn"] Dec 04 13:52:01 crc kubenswrapper[4760]: I1204 13:52:01.864998 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:52:01 crc kubenswrapper[4760]: E1204 13:52:01.865599 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:52:02 crc 
kubenswrapper[4760]: I1204 13:52:02.508810 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-gd6rn" Dec 04 13:52:02 crc kubenswrapper[4760]: I1204 13:52:02.578279 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd85ec41-5ddd-4337-a424-dea0528258bf-host\") pod \"cd85ec41-5ddd-4337-a424-dea0528258bf\" (UID: \"cd85ec41-5ddd-4337-a424-dea0528258bf\") " Dec 04 13:52:02 crc kubenswrapper[4760]: I1204 13:52:02.578561 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9lff\" (UniqueName: \"kubernetes.io/projected/cd85ec41-5ddd-4337-a424-dea0528258bf-kube-api-access-h9lff\") pod \"cd85ec41-5ddd-4337-a424-dea0528258bf\" (UID: \"cd85ec41-5ddd-4337-a424-dea0528258bf\") " Dec 04 13:52:02 crc kubenswrapper[4760]: I1204 13:52:02.578836 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd85ec41-5ddd-4337-a424-dea0528258bf-host" (OuterVolumeSpecName: "host") pod "cd85ec41-5ddd-4337-a424-dea0528258bf" (UID: "cd85ec41-5ddd-4337-a424-dea0528258bf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 13:52:02 crc kubenswrapper[4760]: I1204 13:52:02.579781 4760 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd85ec41-5ddd-4337-a424-dea0528258bf-host\") on node \"crc\" DevicePath \"\"" Dec 04 13:52:02 crc kubenswrapper[4760]: I1204 13:52:02.592162 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd85ec41-5ddd-4337-a424-dea0528258bf-kube-api-access-h9lff" (OuterVolumeSpecName: "kube-api-access-h9lff") pod "cd85ec41-5ddd-4337-a424-dea0528258bf" (UID: "cd85ec41-5ddd-4337-a424-dea0528258bf"). InnerVolumeSpecName "kube-api-access-h9lff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:52:02 crc kubenswrapper[4760]: I1204 13:52:02.682035 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9lff\" (UniqueName: \"kubernetes.io/projected/cd85ec41-5ddd-4337-a424-dea0528258bf-kube-api-access-h9lff\") on node \"crc\" DevicePath \"\"" Dec 04 13:52:03 crc kubenswrapper[4760]: I1204 13:52:03.395947 4760 scope.go:117] "RemoveContainer" containerID="f3ff1d50424d706dc47bee2389dc746da2af6d387a5f9c42971b60f7fcb41a18" Dec 04 13:52:03 crc kubenswrapper[4760]: I1204 13:52:03.395944 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvshw/crc-debug-gd6rn" Dec 04 13:52:03 crc kubenswrapper[4760]: I1204 13:52:03.874869 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd85ec41-5ddd-4337-a424-dea0528258bf" path="/var/lib/kubelet/pods/cd85ec41-5ddd-4337-a424-dea0528258bf/volumes" Dec 04 13:52:14 crc kubenswrapper[4760]: I1204 13:52:14.864484 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:52:14 crc kubenswrapper[4760]: E1204 13:52:14.865198 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:52:23 crc kubenswrapper[4760]: I1204 13:52:23.586577 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-658b6c7fb4-vh8cp_ac9e67e5-eea3-4608-bd30-8483225d28d2/barbican-api/0.log" Dec 04 13:52:23 crc kubenswrapper[4760]: I1204 13:52:23.796400 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-658b6c7fb4-vh8cp_ac9e67e5-eea3-4608-bd30-8483225d28d2/barbican-api-log/0.log" Dec 04 13:52:23 crc kubenswrapper[4760]: I1204 13:52:23.800846 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-795464b486-tr8rf_17dd60e5-2f5d-4f7d-b694-9fa3245dc207/barbican-keystone-listener/0.log" Dec 04 13:52:24 crc kubenswrapper[4760]: I1204 13:52:24.050330 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-577684486f-vqq72_e6ee4654-5dd5-4c14-9985-1037a884e4b7/barbican-worker/0.log" Dec 04 13:52:24 crc kubenswrapper[4760]: I1204 13:52:24.119665 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-577684486f-vqq72_e6ee4654-5dd5-4c14-9985-1037a884e4b7/barbican-worker-log/0.log" Dec 04 13:52:24 crc kubenswrapper[4760]: I1204 13:52:24.360411 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-cq52r_fd552df6-e07d-4042-b4d7-8b154163e633/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:52:24 crc kubenswrapper[4760]: I1204 13:52:24.704417 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b898fcbf-2997-40cf-b167-8875a2763092/ceilometer-notification-agent/0.log" Dec 04 13:52:24 crc kubenswrapper[4760]: I1204 13:52:24.712034 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b898fcbf-2997-40cf-b167-8875a2763092/ceilometer-central-agent/0.log" Dec 04 13:52:24 crc kubenswrapper[4760]: I1204 13:52:24.719440 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-795464b486-tr8rf_17dd60e5-2f5d-4f7d-b694-9fa3245dc207/barbican-keystone-listener-log/0.log" Dec 04 13:52:24 crc kubenswrapper[4760]: I1204 13:52:24.760818 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_b898fcbf-2997-40cf-b167-8875a2763092/proxy-httpd/0.log" Dec 04 13:52:24 crc kubenswrapper[4760]: I1204 13:52:24.874435 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b898fcbf-2997-40cf-b167-8875a2763092/sg-core/0.log" Dec 04 13:52:25 crc kubenswrapper[4760]: I1204 13:52:25.056622 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_ffee0fcf-4f0c-4471-8b39-1762da661157/ceph/0.log" Dec 04 13:52:25 crc kubenswrapper[4760]: I1204 13:52:25.371563 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd/cinder-api-log/0.log" Dec 04 13:52:25 crc kubenswrapper[4760]: I1204 13:52:25.460948 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7e5e90ab-3e7d-4759-9d7f-56e4837fe9dd/cinder-api/0.log" Dec 04 13:52:25 crc kubenswrapper[4760]: I1204 13:52:25.612148 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_21d77c4c-3493-44c4-b194-6d9dd912d5a1/probe/0.log" Dec 04 13:52:25 crc kubenswrapper[4760]: I1204 13:52:25.772839 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5bf94526-eda5-4784-a223-e0ff51ec09e8/cinder-scheduler/0.log" Dec 04 13:52:25 crc kubenswrapper[4760]: I1204 13:52:25.913772 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5bf94526-eda5-4784-a223-e0ff51ec09e8/probe/0.log" Dec 04 13:52:26 crc kubenswrapper[4760]: I1204 13:52:26.193435 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_82e6f33c-9843-4810-9cec-5b7b7525d759/probe/0.log" Dec 04 13:52:26 crc kubenswrapper[4760]: I1204 13:52:26.447551 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-9x22f_7d193294-81b1-457c-99d0-9701df78978b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:52:26 crc kubenswrapper[4760]: I1204 13:52:26.666915 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5vbh6_c5a0aee6-7728-4bcf-8361-93bc45069c7f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:52:26 crc kubenswrapper[4760]: I1204 13:52:26.864167 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:52:26 crc kubenswrapper[4760]: E1204 13:52:26.864678 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:52:26 crc kubenswrapper[4760]: I1204 13:52:26.898511 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d99fc9df9-g4fqd_19e80d45-0318-4d8f-8567-e3aef4734081/init/0.log" Dec 04 13:52:27 crc kubenswrapper[4760]: I1204 13:52:27.102628 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d99fc9df9-g4fqd_19e80d45-0318-4d8f-8567-e3aef4734081/init/0.log" Dec 04 13:52:27 crc kubenswrapper[4760]: I1204 13:52:27.375546 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d99fc9df9-g4fqd_19e80d45-0318-4d8f-8567-e3aef4734081/dnsmasq-dns/0.log" Dec 04 13:52:27 crc kubenswrapper[4760]: I1204 13:52:27.551078 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-89lhj_831952cf-f2b0-482f-bd5e-69dcf19821f9/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 04 13:52:27 crc kubenswrapper[4760]: I1204 13:52:27.710117 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fa001053-23fb-4a80-8f36-8efc97cdc04d/glance-httpd/0.log" Dec 04 13:52:27 crc kubenswrapper[4760]: I1204 13:52:27.885604 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fa001053-23fb-4a80-8f36-8efc97cdc04d/glance-log/0.log" Dec 04 13:52:28 crc kubenswrapper[4760]: I1204 13:52:28.148708 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_034481e8-9000-4642-8f09-01e015db2de2/glance-httpd/0.log" Dec 04 13:52:28 crc kubenswrapper[4760]: I1204 13:52:28.157177 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_034481e8-9000-4642-8f09-01e015db2de2/glance-log/0.log" Dec 04 13:52:28 crc kubenswrapper[4760]: I1204 13:52:28.483827 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5b7fc6c944-sh7tv_a6452e5d-5eb7-4d21-96ea-eefbc327f2f5/horizon/2.log" Dec 04 13:52:28 crc kubenswrapper[4760]: I1204 13:52:28.517865 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_21d77c4c-3493-44c4-b194-6d9dd912d5a1/cinder-backup/0.log" Dec 04 13:52:28 crc kubenswrapper[4760]: I1204 13:52:28.557798 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5b7fc6c944-sh7tv_a6452e5d-5eb7-4d21-96ea-eefbc327f2f5/horizon/1.log" Dec 04 13:52:28 crc kubenswrapper[4760]: I1204 13:52:28.756468 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-6z22t_d0e2d2b4-b155-4d0c-bfd4-7767bbda0b27/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" 
Dec 04 13:52:28 crc kubenswrapper[4760]: I1204 13:52:28.994951 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-b7gjm_38e0514a-720e-4407-9e18-9fff5e901aab/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:29 crc kubenswrapper[4760]: I1204 13:52:29.043697 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29414221-d4ff5_a5b9d885-9739-498a-bd0e-fd78e0d5c779/keystone-cron/0.log"
Dec 04 13:52:29 crc kubenswrapper[4760]: I1204 13:52:29.358874 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5b7fc6c944-sh7tv_a6452e5d-5eb7-4d21-96ea-eefbc327f2f5/horizon-log/0.log"
Dec 04 13:52:29 crc kubenswrapper[4760]: I1204 13:52:29.491773 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2cded8b4-450e-4646-b9c4-9df7334f5532/kube-state-metrics/0.log"
Dec 04 13:52:29 crc kubenswrapper[4760]: I1204 13:52:29.555602 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_82e6f33c-9843-4810-9cec-5b7b7525d759/cinder-volume/0.log"
Dec 04 13:52:29 crc kubenswrapper[4760]: I1204 13:52:29.712074 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-2cb7x_e2dcfb70-5791-401e-a7d3-cec6bf1f4dba/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:30 crc kubenswrapper[4760]: I1204 13:52:30.209413 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_da99ecae-3ef9-484d-a420-0317df7654d5/probe/0.log"
Dec 04 13:52:30 crc kubenswrapper[4760]: I1204 13:52:30.526481 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_da99ecae-3ef9-484d-a420-0317df7654d5/manila-scheduler/0.log"
Dec 04 13:52:30 crc kubenswrapper[4760]: I1204 13:52:30.605402 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_de257688-7d38-4795-b8a6-36b58bdbc2b8/manila-api/0.log"
Dec 04 13:52:30 crc kubenswrapper[4760]: I1204 13:52:30.820177 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_546d61e1-ffcb-48a3-8dae-929470ae8372/probe/0.log"
Dec 04 13:52:31 crc kubenswrapper[4760]: I1204 13:52:31.121504 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_de257688-7d38-4795-b8a6-36b58bdbc2b8/manila-api-log/0.log"
Dec 04 13:52:31 crc kubenswrapper[4760]: I1204 13:52:31.222932 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_546d61e1-ffcb-48a3-8dae-929470ae8372/manila-share/0.log"
Dec 04 13:52:31 crc kubenswrapper[4760]: I1204 13:52:31.726281 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-tl6m6_4e5c739f-fcc7-4384-b7e0-302daee90091/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:32 crc kubenswrapper[4760]: I1204 13:52:32.129377 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-869c8d7d5c-srd5v_c05ef76a-b809-4d3d-972c-2e5d2037b806/neutron-httpd/0.log"
Dec 04 13:52:32 crc kubenswrapper[4760]: I1204 13:52:32.867633 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-869c8d7d5c-srd5v_c05ef76a-b809-4d3d-972c-2e5d2037b806/neutron-api/0.log"
Dec 04 13:52:33 crc kubenswrapper[4760]: I1204 13:52:33.129804 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_40657d96-3f5d-4da0-9783-845c41bfeaae/memcached/0.log"
Dec 04 13:52:33 crc kubenswrapper[4760]: I1204 13:52:33.864629 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9f10a840-d1c4-4d92-bb37-5abe342cb4d1/nova-cell0-conductor-conductor/0.log"
Dec 04 13:52:33 crc kubenswrapper[4760]: I1204 13:52:33.874191 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4e8ed881-2d3f-4939-b065-5b6860ad523d/nova-cell1-conductor-conductor/0.log"
Dec 04 13:52:33 crc kubenswrapper[4760]: I1204 13:52:33.953554 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-d74ff5d57-x29r6_da835318-50f2-43af-9988-bad83a5ee42c/keystone-api/0.log"
Dec 04 13:52:34 crc kubenswrapper[4760]: I1204 13:52:34.266944 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0074865f-e381-4090-a72e-54eec164814e/nova-cell1-novncproxy-novncproxy/0.log"
Dec 04 13:52:34 crc kubenswrapper[4760]: I1204 13:52:34.442567 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-fx275_b8bba20c-b75b-40da-98ad-436a4d121d13/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:34 crc kubenswrapper[4760]: I1204 13:52:34.602360 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d7631823-c503-49fd-85ac-ec4b8bc18a5b/nova-metadata-log/0.log"
Dec 04 13:52:34 crc kubenswrapper[4760]: I1204 13:52:34.620400 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3f4c620c-1c1a-4aa3-aa92-5df1a205e70d/nova-api-log/0.log"
Dec 04 13:52:35 crc kubenswrapper[4760]: I1204 13:52:35.076903 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff/mysql-bootstrap/0.log"
Dec 04 13:52:35 crc kubenswrapper[4760]: I1204 13:52:35.259630 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff/mysql-bootstrap/0.log"
Dec 04 13:52:35 crc kubenswrapper[4760]: I1204 13:52:35.321141 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_db4dbf6c-c9d0-4f9c-889d-a453bb2da6ff/galera/0.log"
Dec 04 13:52:35 crc kubenswrapper[4760]: I1204 13:52:35.409561 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_deac79a1-f615-4bba-b00f-11784b824094/nova-scheduler-scheduler/0.log"
Dec 04 13:52:35 crc kubenswrapper[4760]: I1204 13:52:35.576947 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3f4c620c-1c1a-4aa3-aa92-5df1a205e70d/nova-api-api/0.log"
Dec 04 13:52:35 crc kubenswrapper[4760]: I1204 13:52:35.578800 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9649c38c-ebc4-4103-aa55-c2aa867d6e26/mysql-bootstrap/0.log"
Dec 04 13:52:35 crc kubenswrapper[4760]: I1204 13:52:35.787231 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_edbb46fc-c8ec-45c9-bdb2-36639d92402e/openstackclient/0.log"
Dec 04 13:52:35 crc kubenswrapper[4760]: I1204 13:52:35.861113 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9649c38c-ebc4-4103-aa55-c2aa867d6e26/mysql-bootstrap/0.log"
Dec 04 13:52:35 crc kubenswrapper[4760]: I1204 13:52:35.946007 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9649c38c-ebc4-4103-aa55-c2aa867d6e26/galera/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.010011 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-frcmm_52334746-99b6-4056-a7d7-6df95b72d8de/ovn-controller/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.181148 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xwlm9_a4174c36-258e-4ed9-b6a7-f52818d3faed/openstack-network-exporter/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.339092 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-76hxp_7198d8b5-1a9e-45e7-8151-922d62c1e1f0/ovsdb-server-init/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.555874 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-76hxp_7198d8b5-1a9e-45e7-8151-922d62c1e1f0/ovsdb-server-init/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.597669 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-76hxp_7198d8b5-1a9e-45e7-8151-922d62c1e1f0/ovs-vswitchd/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.637506 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d7631823-c503-49fd-85ac-ec4b8bc18a5b/nova-metadata-metadata/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.655825 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-76hxp_7198d8b5-1a9e-45e7-8151-922d62c1e1f0/ovsdb-server/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.765891 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fg4xt_aec90918-8692-4e3d-ba94-7b8e358b8f60/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.848866 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fa2846c9-8951-497c-bcae-d186f8f62265/openstack-network-exporter/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.877747 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fa2846c9-8951-497c-bcae-d186f8f62265/ovn-northd/0.log"
Dec 04 13:52:36 crc kubenswrapper[4760]: I1204 13:52:36.995842 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_591b6997-8c15-499c-8218-e222a178559e/openstack-network-exporter/0.log"
Dec 04 13:52:37 crc kubenswrapper[4760]: I1204 13:52:37.033450 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_591b6997-8c15-499c-8218-e222a178559e/ovsdbserver-nb/0.log"
Dec 04 13:52:37 crc kubenswrapper[4760]: I1204 13:52:37.079908 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b055fc8b-2181-441b-b4b3-efa345cfde65/openstack-network-exporter/0.log"
Dec 04 13:52:37 crc kubenswrapper[4760]: I1204 13:52:37.164505 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b055fc8b-2181-441b-b4b3-efa345cfde65/ovsdbserver-sb/0.log"
Dec 04 13:52:37 crc kubenswrapper[4760]: I1204 13:52:37.442169 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_482cdb14-c28c-44e0-8054-a5e782a71b54/setup-container/0.log"
Dec 04 13:52:37 crc kubenswrapper[4760]: I1204 13:52:37.658020 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6999bbbcbb-mwnkr_72a9e917-ec75-4b75-a7db-ca42c3e8d1f5/placement-api/0.log"
Dec 04 13:52:37 crc kubenswrapper[4760]: I1204 13:52:37.660867 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_482cdb14-c28c-44e0-8054-a5e782a71b54/setup-container/0.log"
Dec 04 13:52:37 crc kubenswrapper[4760]: I1204 13:52:37.726995 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6999bbbcbb-mwnkr_72a9e917-ec75-4b75-a7db-ca42c3e8d1f5/placement-log/0.log"
Dec 04 13:52:37 crc kubenswrapper[4760]: I1204 13:52:37.777267 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_482cdb14-c28c-44e0-8054-a5e782a71b54/rabbitmq/0.log"
Dec 04 13:52:37 crc kubenswrapper[4760]: I1204 13:52:37.855848 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_50565ce8-ee16-43b2-af07-c92e7444546c/setup-container/0.log"
Dec 04 13:52:38 crc kubenswrapper[4760]: I1204 13:52:38.015959 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_50565ce8-ee16-43b2-af07-c92e7444546c/setup-container/0.log"
Dec 04 13:52:38 crc kubenswrapper[4760]: I1204 13:52:38.168959 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-7b9sd_f8e97dd8-8609-4469-a5f8-488c6b3a2098/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:38 crc kubenswrapper[4760]: I1204 13:52:38.176348 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_50565ce8-ee16-43b2-af07-c92e7444546c/rabbitmq/0.log"
Dec 04 13:52:38 crc kubenswrapper[4760]: I1204 13:52:38.281466 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pc5cw_4ba039cb-b160-4ec8-9f00-a42e7bcce289/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:38 crc kubenswrapper[4760]: I1204 13:52:38.363519 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-rxrmx_51e288c0-c373-4aa9-9c38-cb94fbeccf01/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:38 crc kubenswrapper[4760]: I1204 13:52:38.484322 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-wj92m_264546ed-074b-4824-8da7-d711ccc821c5/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:38 crc kubenswrapper[4760]: I1204 13:52:38.601236 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8mgzh_b297e018-4fcb-40a4-b5f5-3105c4300ae7/ssh-known-hosts-edpm-deployment/0.log"
Dec 04 13:52:38 crc kubenswrapper[4760]: I1204 13:52:38.771232 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f6947989-zvg6b_8b0dacfe-716a-44d3-a653-88fc5183ae97/proxy-server/0.log"
Dec 04 13:52:38 crc kubenswrapper[4760]: I1204 13:52:38.854896 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-m4x4d_4ce3174d-015c-4a85-b58d-af7603479902/swift-ring-rebalance/0.log"
Dec 04 13:52:38 crc kubenswrapper[4760]: I1204 13:52:38.862075 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f6947989-zvg6b_8b0dacfe-716a-44d3-a653-88fc5183ae97/proxy-httpd/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.029294 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/account-auditor/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.096122 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/account-reaper/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.107781 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/account-server/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.123063 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/account-replicator/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.182076 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/container-auditor/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.256699 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/container-replicator/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.312762 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/container-updater/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.315377 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/container-server/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.345636 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/object-auditor/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.409523 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/object-expirer/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.487507 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/object-replicator/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.523637 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/object-server/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.566291 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/object-updater/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.587886 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/rsync/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.645113 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ad805440-7d12-4b3e-b11b-c37463e95bb7/swift-recon-cron/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.775092 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5jrwc_6b99a8e4-6932-4867-b485-872dfefcf4fc/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.862857 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ef91667b-5e29-49a0-9de9-d557462e96c0/tempest-tests-tempest-tests-runner/0.log"
Dec 04 13:52:39 crc kubenswrapper[4760]: I1204 13:52:39.967956 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2526d1e6-dd17-4093-92b2-1bee2a207bac/test-operator-logs-container/0.log"
Dec 04 13:52:40 crc kubenswrapper[4760]: I1204 13:52:40.054732 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-59m8w_85fd1b45-21c2-4541-bac9-ce63eddbc242/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 13:52:41 crc kubenswrapper[4760]: I1204 13:52:41.865330 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f"
Dec 04 13:52:41 crc kubenswrapper[4760]: E1204 13:52:41.865887 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97"
Dec 04 13:52:52 crc kubenswrapper[4760]: I1204 13:52:52.864846 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f"
Dec 04 13:52:52 crc kubenswrapper[4760]: E1204 13:52:52.865570 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97"
Dec 04 13:53:02 crc kubenswrapper[4760]: I1204 13:53:02.487688 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/util/0.log"
Dec 04 13:53:02 crc kubenswrapper[4760]: I1204 13:53:02.655750 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/util/0.log"
Dec 04 13:53:02 crc kubenswrapper[4760]: I1204 13:53:02.730097 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/pull/0.log"
Dec 04 13:53:02 crc kubenswrapper[4760]: I1204 13:53:02.740254 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/pull/0.log"
Dec 04 13:53:02 crc kubenswrapper[4760]: I1204 13:53:02.870650 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/pull/0.log"
Dec 04 13:53:02 crc kubenswrapper[4760]: I1204 13:53:02.870687 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/util/0.log"
Dec 04 13:53:02 crc kubenswrapper[4760]: I1204 13:53:02.917565 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4060e2e1caa7cc3eb4159c86e6d430010a5f26beb806bddeb01c9b06ffxfpdh_e5ca5bd9-ed9a-4568-8b09-57e66e9ad187/extract/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.040274 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-kzkts_69f6297b-8cdc-4bfc-ba61-1868e7805998/kube-rbac-proxy/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.137999 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-kzkts_69f6297b-8cdc-4bfc-ba61-1868e7805998/manager/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.169352 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qgdz6_2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd/kube-rbac-proxy/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.351124 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qgdz6_2eded9d6-9cb7-46d5-8a12-ffa44dbc6fcd/manager/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.379895 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-42kkv_9093443e-058e-41f0-81ea-9ff8ba566d8a/kube-rbac-proxy/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.419510 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-42kkv_9093443e-058e-41f0-81ea-9ff8ba566d8a/manager/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.566508 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-8c7dd_e8a9a9f4-8e40-4506-9aeb-c3e83d62de39/kube-rbac-proxy/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.686379 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-8c7dd_e8a9a9f4-8e40-4506-9aeb-c3e83d62de39/manager/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.764276 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-hfqvw_abc24b8c-0be3-44c7-b011-5ea10803fdf1/manager/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.795491 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-hfqvw_abc24b8c-0be3-44c7-b011-5ea10803fdf1/kube-rbac-proxy/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.885604 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-5jtgd_4018757f-a398-4734-9a4e-b6cc11327b9f/kube-rbac-proxy/0.log"
Dec 04 13:53:03 crc kubenswrapper[4760]: I1204 13:53:03.966483 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-5jtgd_4018757f-a398-4734-9a4e-b6cc11327b9f/manager/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.029965 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-2ffsj_73caa66c-d120-4b70-b417-d7f363ce6236/kube-rbac-proxy/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.200042 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-thpz6_ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a/kube-rbac-proxy/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.266875 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-2ffsj_73caa66c-d120-4b70-b417-d7f363ce6236/manager/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.283949 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-thpz6_ca3c9d7d-086a-4b0a-bf4a-f5381c283f0a/manager/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.436257 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4866m_35ad36be-f7a9-4ca8-bd29-0d5ccd658c53/kube-rbac-proxy/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.572511 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-4866m_35ad36be-f7a9-4ca8-bd29-0d5ccd658c53/manager/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.670428 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-g64p8_b7918a8b-7f47-4d71-820b-95156b273357/kube-rbac-proxy/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.714294 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-g64p8_b7918a8b-7f47-4d71-820b-95156b273357/manager/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.787597 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-m9phl_546bf075-78c1-4324-ba9a-80ac8df0c4f7/kube-rbac-proxy/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.864585 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f"
Dec 04 13:53:04 crc kubenswrapper[4760]: E1204 13:53:04.864844 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.884769 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-m9phl_546bf075-78c1-4324-ba9a-80ac8df0c4f7/manager/0.log"
Dec 04 13:53:04 crc kubenswrapper[4760]: I1204 13:53:04.960832 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-x5zgc_282a54d8-5318-49e0-aefe-a86a7a8d63ac/kube-rbac-proxy/0.log"
Dec 04 13:53:05 crc kubenswrapper[4760]: I1204 13:53:05.059917 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-x5zgc_282a54d8-5318-49e0-aefe-a86a7a8d63ac/manager/0.log"
Dec 04 13:53:05 crc kubenswrapper[4760]: I1204 13:53:05.150350 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-57pzb_a01a242a-291b-4281-a331-91c05efcdf87/kube-rbac-proxy/0.log"
Dec 04 13:53:05 crc kubenswrapper[4760]: I1204 13:53:05.223626 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-57pzb_a01a242a-291b-4281-a331-91c05efcdf87/manager/0.log"
Dec 04 13:53:05 crc kubenswrapper[4760]: I1204 13:53:05.366075 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-slr7f_f13ef420-321e-40f8-90d2-e6fdcbb72752/kube-rbac-proxy/0.log"
Dec 04 13:53:05 crc kubenswrapper[4760]: I1204 13:53:05.386630 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-slr7f_f13ef420-321e-40f8-90d2-e6fdcbb72752/manager/0.log"
Dec 04 13:53:05 crc kubenswrapper[4760]: I1204 13:53:05.520754 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7_ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9/kube-rbac-proxy/0.log"
Dec 04 13:53:05 crc kubenswrapper[4760]: I1204 13:53:05.571127 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4vxhj7_ef4473bb-6ed2-4f7c-96f0-e2998ff6fbc9/manager/0.log"
Dec 04 13:53:06 crc kubenswrapper[4760]: I1204 13:53:06.083717 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-645cc8bbd-8pfbz_91a1898b-cdb0-4f97-9bc0-242d1980bd8c/operator/0.log"
Dec 04 13:53:06 crc kubenswrapper[4760]: I1204 13:53:06.176408 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-k2f7j_a4e176f0-09d3-4710-a8a7-32cd09f03c4d/registry-server/0.log"
Dec 04 13:53:06 crc kubenswrapper[4760]: I1204 13:53:06.329069 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-zxmkv_97874450-ee94-4963-aa10-a58295edae62/kube-rbac-proxy/0.log"
Dec 04 13:53:06 crc kubenswrapper[4760]: I1204 13:53:06.386584 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-zxmkv_97874450-ee94-4963-aa10-a58295edae62/manager/0.log"
Dec 04 13:53:06 crc kubenswrapper[4760]: I1204 13:53:06.485730 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-ss2gd_21a55b1d-ebff-4abd-a556-d272a1753a5b/kube-rbac-proxy/0.log"
Dec 04 13:53:06 crc kubenswrapper[4760]: I1204 13:53:06.595737 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-ss2gd_21a55b1d-ebff-4abd-a556-d272a1753a5b/manager/0.log"
Dec 04 13:53:06 crc kubenswrapper[4760]: I1204 13:53:06.724086 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pptq9_35d340c4-abab-4dc8-8ba4-e8740d6b89d4/operator/0.log"
Dec 04 13:53:06 crc kubenswrapper[4760]: I1204 13:53:06.869834 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-q9nzq_fcf368e1-183d-445d-b3b7-dfd4f08fddcd/kube-rbac-proxy/0.log"
Dec 04 13:53:06 crc kubenswrapper[4760]: I1204 13:53:06.941191 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-q9nzq_fcf368e1-183d-445d-b3b7-dfd4f08fddcd/manager/0.log"
Dec 04 13:53:07 crc kubenswrapper[4760]: I1204 13:53:07.018142 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-77fb648ff9-cnn8v_bfa893f7-8101-4fd1-ae93-94688b827e95/manager/0.log"
Dec 04 13:53:07 crc kubenswrapper[4760]: I1204 13:53:07.030653 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-twx55_a96cd5b4-6668-4815-b121-777fe0e65833/kube-rbac-proxy/0.log"
Dec 04 13:53:07 crc kubenswrapper[4760]: I1204 13:53:07.168837 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-twx55_a96cd5b4-6668-4815-b121-777fe0e65833/manager/0.log"
Dec 04 13:53:07 crc kubenswrapper[4760]: I1204 13:53:07.223027 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-g22dq_750e95e1-6019-4779-a2c0-4abcce4b1c8c/kube-rbac-proxy/0.log"
Dec 04 13:53:07 crc kubenswrapper[4760]: I1204 13:53:07.247638 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-g22dq_750e95e1-6019-4779-a2c0-4abcce4b1c8c/manager/0.log"
Dec 04 13:53:07 crc kubenswrapper[4760]: I1204 13:53:07.399179 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-wwqps_7c8092df-4a88-4a8c-a400-6435f525a5ec/kube-rbac-proxy/0.log"
Dec 04 13:53:07 crc kubenswrapper[4760]: I1204 13:53:07.420826 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-wwqps_7c8092df-4a88-4a8c-a400-6435f525a5ec/manager/0.log"
Dec 04 13:53:15 crc kubenswrapper[4760]: I1204 13:53:15.864426 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f"
Dec 04 13:53:15 crc kubenswrapper[4760]: E1204 13:53:15.865113 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97"
Dec 04 13:53:25 crc kubenswrapper[4760]: I1204 13:53:25.666489 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nwwhw_fdbd7bc3-cca1-4368-814a-126ba13a4f8e/control-plane-machine-set-operator/0.log"
Dec 04 13:53:25 crc kubenswrapper[4760]: I1204 13:53:25.890611 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ghltq_a6b7fabb-b00b-41d3-9a63-291959a7c157/kube-rbac-proxy/0.log"
Dec 04 13:53:25 crc kubenswrapper[4760]: I1204 13:53:25.896846 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ghltq_a6b7fabb-b00b-41d3-9a63-291959a7c157/machine-api-operator/0.log"
Dec 04 13:53:26 crc kubenswrapper[4760]: I1204 13:53:26.864747 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f"
Dec 04 13:53:26 crc kubenswrapper[4760]: E1204 13:53:26.865372 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.334850 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-z26nl_7ec7a22b-07c7-4ea7-b80a-cb9003ef2fcc/cert-manager-controller/0.log"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.477609 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-q62nq_e6ca39a1-4f59-4e58-85a9-eb60075647a8/cert-manager-cainjector/0.log"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.583347 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-gs5vd_6e691fa7-b524-4703-9ba2-9b5d2936deef/cert-manager-webhook/0.log"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.638790 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fvtnp"]
Dec 04 13:53:39 crc kubenswrapper[4760]: E1204 13:53:39.639517 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd85ec41-5ddd-4337-a424-dea0528258bf" containerName="container-00"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.639536 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd85ec41-5ddd-4337-a424-dea0528258bf" containerName="container-00"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.639842 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd85ec41-5ddd-4337-a424-dea0528258bf" containerName="container-00"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.641742 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvtnp"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.654808 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fvtnp"]
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.700372 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-catalog-content\") pod \"certified-operators-fvtnp\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " pod="openshift-marketplace/certified-operators-fvtnp"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.700535 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-utilities\") pod \"certified-operators-fvtnp\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " pod="openshift-marketplace/certified-operators-fvtnp"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.700583 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2d9k\" (UniqueName: \"kubernetes.io/projected/09376db1-1bff-4d75-b28c-f52274c57f05-kube-api-access-g2d9k\") pod \"certified-operators-fvtnp\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " pod="openshift-marketplace/certified-operators-fvtnp"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.802757 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-catalog-content\") pod \"certified-operators-fvtnp\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " pod="openshift-marketplace/certified-operators-fvtnp"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.802869 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-utilities\") pod \"certified-operators-fvtnp\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " pod="openshift-marketplace/certified-operators-fvtnp"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.802908 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2d9k\" (UniqueName: \"kubernetes.io/projected/09376db1-1bff-4d75-b28c-f52274c57f05-kube-api-access-g2d9k\") pod \"certified-operators-fvtnp\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " pod="openshift-marketplace/certified-operators-fvtnp"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.803237 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-catalog-content\") pod \"certified-operators-fvtnp\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " pod="openshift-marketplace/certified-operators-fvtnp"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.803647 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-utilities\") pod \"certified-operators-fvtnp\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " pod="openshift-marketplace/certified-operators-fvtnp"
Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.834713 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2d9k\" (UniqueName: \"kubernetes.io/projected/09376db1-1bff-4d75-b28c-f52274c57f05-kube-api-access-g2d9k\") pod \"certified-operators-fvtnp\" (UID: 
\"09376db1-1bff-4d75-b28c-f52274c57f05\") " pod="openshift-marketplace/certified-operators-fvtnp" Dec 04 13:53:39 crc kubenswrapper[4760]: I1204 13:53:39.975713 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvtnp" Dec 04 13:53:40 crc kubenswrapper[4760]: I1204 13:53:40.753045 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fvtnp"] Dec 04 13:53:41 crc kubenswrapper[4760]: I1204 13:53:41.353309 4760 generic.go:334] "Generic (PLEG): container finished" podID="09376db1-1bff-4d75-b28c-f52274c57f05" containerID="7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451" exitCode=0 Dec 04 13:53:41 crc kubenswrapper[4760]: I1204 13:53:41.353418 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvtnp" event={"ID":"09376db1-1bff-4d75-b28c-f52274c57f05","Type":"ContainerDied","Data":"7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451"} Dec 04 13:53:41 crc kubenswrapper[4760]: I1204 13:53:41.353884 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvtnp" event={"ID":"09376db1-1bff-4d75-b28c-f52274c57f05","Type":"ContainerStarted","Data":"e3d9acf6bcc248a57df86ddb27fb64a2c8c8a0acc11532190ee07e61f6fa4a97"} Dec 04 13:53:41 crc kubenswrapper[4760]: I1204 13:53:41.356886 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 13:53:41 crc kubenswrapper[4760]: I1204 13:53:41.864282 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:53:41 crc kubenswrapper[4760]: E1204 13:53:41.864558 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:53:42 crc kubenswrapper[4760]: I1204 13:53:42.365098 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvtnp" event={"ID":"09376db1-1bff-4d75-b28c-f52274c57f05","Type":"ContainerStarted","Data":"b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565"} Dec 04 13:53:44 crc kubenswrapper[4760]: I1204 13:53:44.391885 4760 generic.go:334] "Generic (PLEG): container finished" podID="09376db1-1bff-4d75-b28c-f52274c57f05" containerID="b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565" exitCode=0 Dec 04 13:53:44 crc kubenswrapper[4760]: I1204 13:53:44.391955 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvtnp" event={"ID":"09376db1-1bff-4d75-b28c-f52274c57f05","Type":"ContainerDied","Data":"b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565"} Dec 04 13:53:45 crc kubenswrapper[4760]: I1204 13:53:45.404272 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvtnp" event={"ID":"09376db1-1bff-4d75-b28c-f52274c57f05","Type":"ContainerStarted","Data":"f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c"} Dec 04 13:53:45 crc kubenswrapper[4760]: I1204 13:53:45.435270 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fvtnp" podStartSLOduration=2.682896378 podStartE2EDuration="6.435235717s" podCreationTimestamp="2025-12-04 13:53:39 +0000 UTC" firstStartedPulling="2025-12-04 13:53:41.356573037 +0000 UTC m=+6024.398019604" lastFinishedPulling="2025-12-04 13:53:45.108912376 +0000 UTC m=+6028.150358943" observedRunningTime="2025-12-04 13:53:45.43090194 +0000 UTC m=+6028.472348507" 
watchObservedRunningTime="2025-12-04 13:53:45.435235717 +0000 UTC m=+6028.476682284" Dec 04 13:53:49 crc kubenswrapper[4760]: I1204 13:53:49.976820 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fvtnp" Dec 04 13:53:49 crc kubenswrapper[4760]: I1204 13:53:49.977889 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fvtnp" Dec 04 13:53:50 crc kubenswrapper[4760]: I1204 13:53:50.026456 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fvtnp" Dec 04 13:53:50 crc kubenswrapper[4760]: I1204 13:53:50.501596 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fvtnp" Dec 04 13:53:50 crc kubenswrapper[4760]: I1204 13:53:50.828515 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fvtnp"] Dec 04 13:53:52 crc kubenswrapper[4760]: I1204 13:53:52.468515 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fvtnp" podUID="09376db1-1bff-4d75-b28c-f52274c57f05" containerName="registry-server" containerID="cri-o://f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c" gracePeriod=2 Dec 04 13:53:52 crc kubenswrapper[4760]: I1204 13:53:52.864428 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:53:52 crc kubenswrapper[4760]: E1204 13:53:52.865140 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.008317 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fvtnp" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.160772 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2d9k\" (UniqueName: \"kubernetes.io/projected/09376db1-1bff-4d75-b28c-f52274c57f05-kube-api-access-g2d9k\") pod \"09376db1-1bff-4d75-b28c-f52274c57f05\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.160940 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-utilities\") pod \"09376db1-1bff-4d75-b28c-f52274c57f05\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.161118 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-catalog-content\") pod \"09376db1-1bff-4d75-b28c-f52274c57f05\" (UID: \"09376db1-1bff-4d75-b28c-f52274c57f05\") " Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.162973 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-utilities" (OuterVolumeSpecName: "utilities") pod "09376db1-1bff-4d75-b28c-f52274c57f05" (UID: "09376db1-1bff-4d75-b28c-f52274c57f05"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.174577 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09376db1-1bff-4d75-b28c-f52274c57f05-kube-api-access-g2d9k" (OuterVolumeSpecName: "kube-api-access-g2d9k") pod "09376db1-1bff-4d75-b28c-f52274c57f05" (UID: "09376db1-1bff-4d75-b28c-f52274c57f05"). InnerVolumeSpecName "kube-api-access-g2d9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.212993 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09376db1-1bff-4d75-b28c-f52274c57f05" (UID: "09376db1-1bff-4d75-b28c-f52274c57f05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.264836 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.265072 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2d9k\" (UniqueName: \"kubernetes.io/projected/09376db1-1bff-4d75-b28c-f52274c57f05-kube-api-access-g2d9k\") on node \"crc\" DevicePath \"\"" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.265088 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09376db1-1bff-4d75-b28c-f52274c57f05-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.477686 4760 generic.go:334] "Generic (PLEG): container finished" podID="09376db1-1bff-4d75-b28c-f52274c57f05" 
containerID="f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c" exitCode=0 Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.477734 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvtnp" event={"ID":"09376db1-1bff-4d75-b28c-f52274c57f05","Type":"ContainerDied","Data":"f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c"} Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.477760 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fvtnp" event={"ID":"09376db1-1bff-4d75-b28c-f52274c57f05","Type":"ContainerDied","Data":"e3d9acf6bcc248a57df86ddb27fb64a2c8c8a0acc11532190ee07e61f6fa4a97"} Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.477777 4760 scope.go:117] "RemoveContainer" containerID="f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.477927 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fvtnp" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.510510 4760 scope.go:117] "RemoveContainer" containerID="b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.527414 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fvtnp"] Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.534024 4760 scope.go:117] "RemoveContainer" containerID="7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.535812 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fvtnp"] Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.593432 4760 scope.go:117] "RemoveContainer" containerID="f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c" Dec 04 13:53:53 crc kubenswrapper[4760]: E1204 13:53:53.594179 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c\": container with ID starting with f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c not found: ID does not exist" containerID="f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.594251 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c"} err="failed to get container status \"f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c\": rpc error: code = NotFound desc = could not find container \"f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c\": container with ID starting with f13180f14267fc3c573a1f0b59e2201ca3879a5e91b94841222ac33d6b69313c not 
found: ID does not exist" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.594284 4760 scope.go:117] "RemoveContainer" containerID="b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565" Dec 04 13:53:53 crc kubenswrapper[4760]: E1204 13:53:53.594719 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565\": container with ID starting with b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565 not found: ID does not exist" containerID="b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.594750 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565"} err="failed to get container status \"b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565\": rpc error: code = NotFound desc = could not find container \"b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565\": container with ID starting with b67db2cceac13f1e94f7d13d8995fd26cf352663832ddcb394ec40c826b4a565 not found: ID does not exist" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.594773 4760 scope.go:117] "RemoveContainer" containerID="7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451" Dec 04 13:53:53 crc kubenswrapper[4760]: E1204 13:53:53.595060 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451\": container with ID starting with 7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451 not found: ID does not exist" containerID="7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.595081 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451"} err="failed to get container status \"7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451\": rpc error: code = NotFound desc = could not find container \"7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451\": container with ID starting with 7e2865355adfacbc1b71fbe3e6f75c3cc16c8f1fa66e55b4e69071777362c451 not found: ID does not exist" Dec 04 13:53:53 crc kubenswrapper[4760]: I1204 13:53:53.900627 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09376db1-1bff-4d75-b28c-f52274c57f05" path="/var/lib/kubelet/pods/09376db1-1bff-4d75-b28c-f52274c57f05/volumes" Dec 04 13:53:56 crc kubenswrapper[4760]: I1204 13:53:56.618329 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-6hrpm_ac746240-d1e4-4a04-98f1-b22871ca58e4/nmstate-console-plugin/0.log" Dec 04 13:53:56 crc kubenswrapper[4760]: I1204 13:53:56.824550 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wvpfn_491432c2-b909-4092-a693-409b65208f85/nmstate-handler/0.log" Dec 04 13:53:56 crc kubenswrapper[4760]: I1204 13:53:56.849855 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6wct8_e4f6d4b1-9f69-4970-a2d0-141049cbee82/kube-rbac-proxy/0.log" Dec 04 13:53:56 crc kubenswrapper[4760]: I1204 13:53:56.887578 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6wct8_e4f6d4b1-9f69-4970-a2d0-141049cbee82/nmstate-metrics/0.log" Dec 04 13:53:57 crc kubenswrapper[4760]: I1204 13:53:57.095558 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-lmqv2_e9bf818f-d737-4114-a5c3-003834179d27/nmstate-webhook/0.log" Dec 04 13:53:57 crc 
kubenswrapper[4760]: I1204 13:53:57.110797 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-vv8cz_a753604d-ead8-4550-be02-3a5ae4827390/nmstate-operator/0.log" Dec 04 13:54:03 crc kubenswrapper[4760]: I1204 13:54:03.864629 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:54:03 crc kubenswrapper[4760]: E1204 13:54:03.865411 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:54:12 crc kubenswrapper[4760]: I1204 13:54:12.588421 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-9vpvj_66980e43-ab0c-4f0e-a66b-1ba0047809d2/kube-rbac-proxy/0.log" Dec 04 13:54:12 crc kubenswrapper[4760]: I1204 13:54:12.589318 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-9vpvj_66980e43-ab0c-4f0e-a66b-1ba0047809d2/controller/0.log" Dec 04 13:54:12 crc kubenswrapper[4760]: I1204 13:54:12.770707 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-frr-files/0.log" Dec 04 13:54:12 crc kubenswrapper[4760]: I1204 13:54:12.996963 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-reloader/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.008101 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-frr-files/0.log" Dec 04 13:54:13 
crc kubenswrapper[4760]: I1204 13:54:13.038409 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-metrics/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.064602 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-reloader/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.242429 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-metrics/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.277464 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-frr-files/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.286236 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-reloader/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.329083 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-metrics/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.521449 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-metrics/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.527892 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-reloader/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.527892 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/cp-frr-files/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.574825 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/controller/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.750921 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/frr-metrics/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.761717 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/kube-rbac-proxy/0.log" Dec 04 13:54:13 crc kubenswrapper[4760]: I1204 13:54:13.805069 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/kube-rbac-proxy-frr/0.log" Dec 04 13:54:14 crc kubenswrapper[4760]: I1204 13:54:14.146676 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/reloader/0.log" Dec 04 13:54:14 crc kubenswrapper[4760]: I1204 13:54:14.276685 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-5pb2g_49e9ed6d-c8f2-4aaf-ab8a-95e018dddbae/frr-k8s-webhook-server/0.log" Dec 04 13:54:14 crc kubenswrapper[4760]: I1204 13:54:14.483495 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-645c4f57b7-84fcv_f4735108-14df-4389-af8f-d3e7c56eba8f/manager/0.log" Dec 04 13:54:14 crc kubenswrapper[4760]: I1204 13:54:14.618187 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7578d999c8-hsqg8_ce6e08a4-5fa8-42c9-929d-94af09b81ec2/webhook-server/0.log" Dec 04 13:54:14 crc kubenswrapper[4760]: I1204 13:54:14.787221 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rxp6m_986d9828-3e9c-4b9a-bdc5-aaa3eb184641/kube-rbac-proxy/0.log" Dec 04 13:54:15 crc kubenswrapper[4760]: I1204 13:54:15.545379 4760 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-rxp6m_986d9828-3e9c-4b9a-bdc5-aaa3eb184641/speaker/0.log" Dec 04 13:54:15 crc kubenswrapper[4760]: I1204 13:54:15.571140 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n4b8p_c29571e9-5b28-43ef-a429-9af2daa6f4bc/frr/0.log" Dec 04 13:54:18 crc kubenswrapper[4760]: I1204 13:54:18.864835 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:54:18 crc kubenswrapper[4760]: E1204 13:54:18.865711 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:54:28 crc kubenswrapper[4760]: I1204 13:54:28.785115 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/util/0.log" Dec 04 13:54:28 crc kubenswrapper[4760]: I1204 13:54:28.968575 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/util/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.026389 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/pull/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.055449 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/pull/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.183717 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/util/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.187285 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/extract/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.218855 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffpzp5_1cc8460c-742b-4533-a26f-225de9c85310/pull/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.350181 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/util/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.540951 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/util/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.542013 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/pull/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.619801 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/pull/0.log" Dec 04 
13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.776528 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/util/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.793653 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/pull/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.830865 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83c5h7q_e3a7c8db-106b-48f4-a044-e604a1c6f934/extract/0.log" Dec 04 13:54:29 crc kubenswrapper[4760]: I1204 13:54:29.966470 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-utilities/0.log" Dec 04 13:54:30 crc kubenswrapper[4760]: I1204 13:54:30.226683 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-content/0.log" Dec 04 13:54:30 crc kubenswrapper[4760]: I1204 13:54:30.243779 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-content/0.log" Dec 04 13:54:30 crc kubenswrapper[4760]: I1204 13:54:30.250063 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-utilities/0.log" Dec 04 13:54:30 crc kubenswrapper[4760]: I1204 13:54:30.411912 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-utilities/0.log" Dec 04 13:54:30 crc 
kubenswrapper[4760]: I1204 13:54:30.428392 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/extract-content/0.log" Dec 04 13:54:30 crc kubenswrapper[4760]: I1204 13:54:30.630538 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-utilities/0.log" Dec 04 13:54:30 crc kubenswrapper[4760]: I1204 13:54:30.891394 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-content/0.log" Dec 04 13:54:30 crc kubenswrapper[4760]: I1204 13:54:30.977322 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-utilities/0.log" Dec 04 13:54:31 crc kubenswrapper[4760]: I1204 13:54:31.017626 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-content/0.log" Dec 04 13:54:31 crc kubenswrapper[4760]: I1204 13:54:31.120366 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6p954_d72c0613-6684-4d59-9968-065130b7b861/registry-server/0.log" Dec 04 13:54:31 crc kubenswrapper[4760]: I1204 13:54:31.157775 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-utilities/0.log" Dec 04 13:54:31 crc kubenswrapper[4760]: I1204 13:54:31.268018 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/extract-content/0.log" Dec 04 13:54:31 crc kubenswrapper[4760]: I1204 13:54:31.423566 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xj92j_df130580-94d3-40cd-a840-c85281e78fcc/marketplace-operator/0.log" Dec 04 13:54:31 crc kubenswrapper[4760]: I1204 13:54:31.673318 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-utilities/0.log" Dec 04 13:54:31 crc kubenswrapper[4760]: I1204 13:54:31.840774 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-content/0.log" Dec 04 13:54:31 crc kubenswrapper[4760]: I1204 13:54:31.865321 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:54:31 crc kubenswrapper[4760]: E1204 13:54:31.865627 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:54:31 crc kubenswrapper[4760]: I1204 13:54:31.877361 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-utilities/0.log" Dec 04 13:54:31 crc kubenswrapper[4760]: I1204 13:54:31.897337 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-content/0.log" Dec 04 13:54:32 crc kubenswrapper[4760]: I1204 13:54:32.061733 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sqqhg_b4c2d28f-e4ef-41bc-8769-83eb39cf2569/registry-server/0.log" Dec 04 
13:54:32 crc kubenswrapper[4760]: I1204 13:54:32.160045 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-utilities/0.log" Dec 04 13:54:32 crc kubenswrapper[4760]: I1204 13:54:32.372809 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/extract-content/0.log" Dec 04 13:54:32 crc kubenswrapper[4760]: I1204 13:54:32.540489 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h9cp7_43708b75-fa1d-4306-90e2-5d057baed057/registry-server/0.log" Dec 04 13:54:32 crc kubenswrapper[4760]: I1204 13:54:32.578835 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-utilities/0.log" Dec 04 13:54:32 crc kubenswrapper[4760]: I1204 13:54:32.759739 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-content/0.log" Dec 04 13:54:32 crc kubenswrapper[4760]: I1204 13:54:32.764983 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-utilities/0.log" Dec 04 13:54:32 crc kubenswrapper[4760]: I1204 13:54:32.772076 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-content/0.log" Dec 04 13:54:32 crc kubenswrapper[4760]: I1204 13:54:32.957151 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-content/0.log" Dec 04 13:54:32 crc kubenswrapper[4760]: I1204 13:54:32.981286 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/extract-utilities/0.log" Dec 04 13:54:33 crc kubenswrapper[4760]: I1204 13:54:33.732850 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rkrwq_40139a83-284c-4524-90c5-d20d77d6c286/registry-server/0.log" Dec 04 13:54:42 crc kubenswrapper[4760]: I1204 13:54:42.864643 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:54:42 crc kubenswrapper[4760]: E1204 13:54:42.865465 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:54:53 crc kubenswrapper[4760]: I1204 13:54:53.866671 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:54:53 crc kubenswrapper[4760]: E1204 13:54:53.867328 4760 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnrr9_openshift-machine-config-operator(65f76314-9511-40ed-9ad6-2220378e7e97)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" Dec 04 13:55:05 crc kubenswrapper[4760]: I1204 13:55:05.864739 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:55:06 crc kubenswrapper[4760]: I1204 13:55:06.194148 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"ef8b160a33e628bf2004e78ec3dece4bed2fad4d300aac3f9ee82eecf9cde264"} Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.299724 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sn2hl"] Dec 04 13:55:16 crc kubenswrapper[4760]: E1204 13:55:16.300792 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09376db1-1bff-4d75-b28c-f52274c57f05" containerName="registry-server" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.300806 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="09376db1-1bff-4d75-b28c-f52274c57f05" containerName="registry-server" Dec 04 13:55:16 crc kubenswrapper[4760]: E1204 13:55:16.300822 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09376db1-1bff-4d75-b28c-f52274c57f05" containerName="extract-content" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.300827 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="09376db1-1bff-4d75-b28c-f52274c57f05" containerName="extract-content" Dec 04 13:55:16 crc kubenswrapper[4760]: E1204 13:55:16.300850 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09376db1-1bff-4d75-b28c-f52274c57f05" containerName="extract-utilities" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.300857 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="09376db1-1bff-4d75-b28c-f52274c57f05" containerName="extract-utilities" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.301062 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="09376db1-1bff-4d75-b28c-f52274c57f05" containerName="registry-server" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.302690 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.315933 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sn2hl"] Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.485886 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-catalog-content\") pod \"community-operators-sn2hl\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.486155 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntd7f\" (UniqueName: \"kubernetes.io/projected/091508c8-69c0-4b18-bfd6-677625366ada-kube-api-access-ntd7f\") pod \"community-operators-sn2hl\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.486351 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-utilities\") pod \"community-operators-sn2hl\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.587690 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-utilities\") pod \"community-operators-sn2hl\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.587809 4760 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-catalog-content\") pod \"community-operators-sn2hl\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.587931 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntd7f\" (UniqueName: \"kubernetes.io/projected/091508c8-69c0-4b18-bfd6-677625366ada-kube-api-access-ntd7f\") pod \"community-operators-sn2hl\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.588279 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-utilities\") pod \"community-operators-sn2hl\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.588297 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-catalog-content\") pod \"community-operators-sn2hl\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.615543 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntd7f\" (UniqueName: \"kubernetes.io/projected/091508c8-69c0-4b18-bfd6-677625366ada-kube-api-access-ntd7f\") pod \"community-operators-sn2hl\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:16 crc kubenswrapper[4760]: I1204 13:55:16.622355 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:17 crc kubenswrapper[4760]: I1204 13:55:17.222758 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sn2hl"] Dec 04 13:55:17 crc kubenswrapper[4760]: I1204 13:55:17.405237 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2hl" event={"ID":"091508c8-69c0-4b18-bfd6-677625366ada","Type":"ContainerStarted","Data":"8ed9cf95e671e664448ee4579ac7d112ce27a21453637ee58fbefae568a613cc"} Dec 04 13:55:18 crc kubenswrapper[4760]: I1204 13:55:18.417275 4760 generic.go:334] "Generic (PLEG): container finished" podID="091508c8-69c0-4b18-bfd6-677625366ada" containerID="2df40bdfcf74b77c658819beddccb6da5809126ac9ecb63f835ac4c23d0e3f5f" exitCode=0 Dec 04 13:55:18 crc kubenswrapper[4760]: I1204 13:55:18.418467 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2hl" event={"ID":"091508c8-69c0-4b18-bfd6-677625366ada","Type":"ContainerDied","Data":"2df40bdfcf74b77c658819beddccb6da5809126ac9ecb63f835ac4c23d0e3f5f"} Dec 04 13:55:19 crc kubenswrapper[4760]: I1204 13:55:19.430124 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2hl" event={"ID":"091508c8-69c0-4b18-bfd6-677625366ada","Type":"ContainerStarted","Data":"229104418cbc63dd96da02b7962c84bba5c3f23220cac541ff693bc1c92d229d"} Dec 04 13:55:20 crc kubenswrapper[4760]: I1204 13:55:20.442834 4760 generic.go:334] "Generic (PLEG): container finished" podID="091508c8-69c0-4b18-bfd6-677625366ada" containerID="229104418cbc63dd96da02b7962c84bba5c3f23220cac541ff693bc1c92d229d" exitCode=0 Dec 04 13:55:20 crc kubenswrapper[4760]: I1204 13:55:20.442946 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2hl" 
event={"ID":"091508c8-69c0-4b18-bfd6-677625366ada","Type":"ContainerDied","Data":"229104418cbc63dd96da02b7962c84bba5c3f23220cac541ff693bc1c92d229d"} Dec 04 13:55:21 crc kubenswrapper[4760]: I1204 13:55:21.457616 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2hl" event={"ID":"091508c8-69c0-4b18-bfd6-677625366ada","Type":"ContainerStarted","Data":"d4f9233ee93082a5a6264ab3098e9a1e00c3e04f8fb61d4d477f39cfab272da2"} Dec 04 13:55:21 crc kubenswrapper[4760]: I1204 13:55:21.489387 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sn2hl" podStartSLOduration=3.055367186 podStartE2EDuration="5.489362553s" podCreationTimestamp="2025-12-04 13:55:16 +0000 UTC" firstStartedPulling="2025-12-04 13:55:18.421024893 +0000 UTC m=+6121.462471460" lastFinishedPulling="2025-12-04 13:55:20.85502026 +0000 UTC m=+6123.896466827" observedRunningTime="2025-12-04 13:55:21.482327361 +0000 UTC m=+6124.523773938" watchObservedRunningTime="2025-12-04 13:55:21.489362553 +0000 UTC m=+6124.530809120" Dec 04 13:55:26 crc kubenswrapper[4760]: I1204 13:55:26.623537 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:26 crc kubenswrapper[4760]: I1204 13:55:26.624082 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:26 crc kubenswrapper[4760]: I1204 13:55:26.669991 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:27 crc kubenswrapper[4760]: I1204 13:55:27.568005 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:27 crc kubenswrapper[4760]: I1204 13:55:27.635396 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-sn2hl"] Dec 04 13:55:29 crc kubenswrapper[4760]: I1204 13:55:29.536503 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sn2hl" podUID="091508c8-69c0-4b18-bfd6-677625366ada" containerName="registry-server" containerID="cri-o://d4f9233ee93082a5a6264ab3098e9a1e00c3e04f8fb61d4d477f39cfab272da2" gracePeriod=2 Dec 04 13:55:30 crc kubenswrapper[4760]: I1204 13:55:30.551483 4760 generic.go:334] "Generic (PLEG): container finished" podID="091508c8-69c0-4b18-bfd6-677625366ada" containerID="d4f9233ee93082a5a6264ab3098e9a1e00c3e04f8fb61d4d477f39cfab272da2" exitCode=0 Dec 04 13:55:30 crc kubenswrapper[4760]: I1204 13:55:30.551555 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2hl" event={"ID":"091508c8-69c0-4b18-bfd6-677625366ada","Type":"ContainerDied","Data":"d4f9233ee93082a5a6264ab3098e9a1e00c3e04f8fb61d4d477f39cfab272da2"} Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.253202 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.355838 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntd7f\" (UniqueName: \"kubernetes.io/projected/091508c8-69c0-4b18-bfd6-677625366ada-kube-api-access-ntd7f\") pod \"091508c8-69c0-4b18-bfd6-677625366ada\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.355898 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-catalog-content\") pod \"091508c8-69c0-4b18-bfd6-677625366ada\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.355929 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-utilities\") pod \"091508c8-69c0-4b18-bfd6-677625366ada\" (UID: \"091508c8-69c0-4b18-bfd6-677625366ada\") " Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.356983 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-utilities" (OuterVolumeSpecName: "utilities") pod "091508c8-69c0-4b18-bfd6-677625366ada" (UID: "091508c8-69c0-4b18-bfd6-677625366ada"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.362365 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/091508c8-69c0-4b18-bfd6-677625366ada-kube-api-access-ntd7f" (OuterVolumeSpecName: "kube-api-access-ntd7f") pod "091508c8-69c0-4b18-bfd6-677625366ada" (UID: "091508c8-69c0-4b18-bfd6-677625366ada"). InnerVolumeSpecName "kube-api-access-ntd7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.408898 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "091508c8-69c0-4b18-bfd6-677625366ada" (UID: "091508c8-69c0-4b18-bfd6-677625366ada"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.458763 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntd7f\" (UniqueName: \"kubernetes.io/projected/091508c8-69c0-4b18-bfd6-677625366ada-kube-api-access-ntd7f\") on node \"crc\" DevicePath \"\"" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.458799 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.458809 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091508c8-69c0-4b18-bfd6-677625366ada-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.567343 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn2hl" event={"ID":"091508c8-69c0-4b18-bfd6-677625366ada","Type":"ContainerDied","Data":"8ed9cf95e671e664448ee4579ac7d112ce27a21453637ee58fbefae568a613cc"} Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.567425 4760 scope.go:117] "RemoveContainer" containerID="d4f9233ee93082a5a6264ab3098e9a1e00c3e04f8fb61d4d477f39cfab272da2" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.567666 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sn2hl" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.598031 4760 scope.go:117] "RemoveContainer" containerID="229104418cbc63dd96da02b7962c84bba5c3f23220cac541ff693bc1c92d229d" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.610514 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sn2hl"] Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.624893 4760 scope.go:117] "RemoveContainer" containerID="2df40bdfcf74b77c658819beddccb6da5809126ac9ecb63f835ac4c23d0e3f5f" Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.629645 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sn2hl"] Dec 04 13:55:31 crc kubenswrapper[4760]: I1204 13:55:31.875455 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="091508c8-69c0-4b18-bfd6-677625366ada" path="/var/lib/kubelet/pods/091508c8-69c0-4b18-bfd6-677625366ada/volumes" Dec 04 13:56:54 crc kubenswrapper[4760]: I1204 13:56:54.622822 4760 generic.go:334] "Generic (PLEG): container finished" podID="79911128-6dfb-4fa9-b375-29ad707c556c" containerID="f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af" exitCode=0 Dec 04 13:56:54 crc kubenswrapper[4760]: I1204 13:56:54.622902 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvshw/must-gather-rjrw2" event={"ID":"79911128-6dfb-4fa9-b375-29ad707c556c","Type":"ContainerDied","Data":"f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af"} Dec 04 13:56:54 crc kubenswrapper[4760]: I1204 13:56:54.625446 4760 scope.go:117] "RemoveContainer" containerID="f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af" Dec 04 13:56:54 crc kubenswrapper[4760]: I1204 13:56:54.746555 4760 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-kvshw_must-gather-rjrw2_79911128-6dfb-4fa9-b375-29ad707c556c/gather/0.log" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.186717 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kvshw/must-gather-rjrw2"] Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.187526 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kvshw/must-gather-rjrw2" podUID="79911128-6dfb-4fa9-b375-29ad707c556c" containerName="copy" containerID="cri-o://c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f" gracePeriod=2 Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.199187 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kvshw/must-gather-rjrw2"] Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.726347 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kvshw_must-gather-rjrw2_79911128-6dfb-4fa9-b375-29ad707c556c/copy/0.log" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.727376 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvshw/must-gather-rjrw2" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.776185 4760 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kvshw_must-gather-rjrw2_79911128-6dfb-4fa9-b375-29ad707c556c/copy/0.log" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.776899 4760 generic.go:334] "Generic (PLEG): container finished" podID="79911128-6dfb-4fa9-b375-29ad707c556c" containerID="c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f" exitCode=143 Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.776981 4760 scope.go:117] "RemoveContainer" containerID="c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.777070 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvshw/must-gather-rjrw2" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.798526 4760 scope.go:117] "RemoveContainer" containerID="f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.850486 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z9z5\" (UniqueName: \"kubernetes.io/projected/79911128-6dfb-4fa9-b375-29ad707c556c-kube-api-access-2z9z5\") pod \"79911128-6dfb-4fa9-b375-29ad707c556c\" (UID: \"79911128-6dfb-4fa9-b375-29ad707c556c\") " Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.850664 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79911128-6dfb-4fa9-b375-29ad707c556c-must-gather-output\") pod \"79911128-6dfb-4fa9-b375-29ad707c556c\" (UID: \"79911128-6dfb-4fa9-b375-29ad707c556c\") " Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.862625 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/79911128-6dfb-4fa9-b375-29ad707c556c-kube-api-access-2z9z5" (OuterVolumeSpecName: "kube-api-access-2z9z5") pod "79911128-6dfb-4fa9-b375-29ad707c556c" (UID: "79911128-6dfb-4fa9-b375-29ad707c556c"). InnerVolumeSpecName "kube-api-access-2z9z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.877684 4760 scope.go:117] "RemoveContainer" containerID="c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f" Dec 04 13:57:06 crc kubenswrapper[4760]: E1204 13:57:06.878164 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f\": container with ID starting with c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f not found: ID does not exist" containerID="c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.878200 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f"} err="failed to get container status \"c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f\": rpc error: code = NotFound desc = could not find container \"c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f\": container with ID starting with c13be4b64174abd77b728358d15b2f77d8fb61840d6cd972eacba0ce585d492f not found: ID does not exist" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.878236 4760 scope.go:117] "RemoveContainer" containerID="f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af" Dec 04 13:57:06 crc kubenswrapper[4760]: E1204 13:57:06.878752 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af\": 
container with ID starting with f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af not found: ID does not exist" containerID="f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.878801 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af"} err="failed to get container status \"f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af\": rpc error: code = NotFound desc = could not find container \"f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af\": container with ID starting with f3a0e195e715cf713cf16cd1de60fb3ba157d3a08a4c65d6e8441b40f20823af not found: ID does not exist" Dec 04 13:57:06 crc kubenswrapper[4760]: I1204 13:57:06.954294 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z9z5\" (UniqueName: \"kubernetes.io/projected/79911128-6dfb-4fa9-b375-29ad707c556c-kube-api-access-2z9z5\") on node \"crc\" DevicePath \"\"" Dec 04 13:57:07 crc kubenswrapper[4760]: I1204 13:57:07.057857 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79911128-6dfb-4fa9-b375-29ad707c556c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "79911128-6dfb-4fa9-b375-29ad707c556c" (UID: "79911128-6dfb-4fa9-b375-29ad707c556c"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:57:07 crc kubenswrapper[4760]: I1204 13:57:07.158741 4760 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79911128-6dfb-4fa9-b375-29ad707c556c-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 04 13:57:07 crc kubenswrapper[4760]: I1204 13:57:07.878124 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79911128-6dfb-4fa9-b375-29ad707c556c" path="/var/lib/kubelet/pods/79911128-6dfb-4fa9-b375-29ad707c556c/volumes" Dec 04 13:57:33 crc kubenswrapper[4760]: I1204 13:57:33.380510 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:57:33 crc kubenswrapper[4760]: I1204 13:57:33.381044 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:58:02 crc kubenswrapper[4760]: I1204 13:58:02.675154 4760 scope.go:117] "RemoveContainer" containerID="a2c6f49e07ce79f11cb369d0bbca8217cedf34526579d5df5e425235707686e8" Dec 04 13:58:03 crc kubenswrapper[4760]: I1204 13:58:03.381136 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:58:03 crc kubenswrapper[4760]: I1204 13:58:03.381530 4760 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:58:33 crc kubenswrapper[4760]: I1204 13:58:33.381259 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 13:58:33 crc kubenswrapper[4760]: I1204 13:58:33.382548 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 13:58:33 crc kubenswrapper[4760]: I1204 13:58:33.382644 4760 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" Dec 04 13:58:33 crc kubenswrapper[4760]: I1204 13:58:33.384196 4760 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef8b160a33e628bf2004e78ec3dece4bed2fad4d300aac3f9ee82eecf9cde264"} pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 13:58:33 crc kubenswrapper[4760]: I1204 13:58:33.384430 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" 
containerID="cri-o://ef8b160a33e628bf2004e78ec3dece4bed2fad4d300aac3f9ee82eecf9cde264" gracePeriod=600 Dec 04 13:58:33 crc kubenswrapper[4760]: I1204 13:58:33.625343 4760 generic.go:334] "Generic (PLEG): container finished" podID="65f76314-9511-40ed-9ad6-2220378e7e97" containerID="ef8b160a33e628bf2004e78ec3dece4bed2fad4d300aac3f9ee82eecf9cde264" exitCode=0 Dec 04 13:58:33 crc kubenswrapper[4760]: I1204 13:58:33.625419 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerDied","Data":"ef8b160a33e628bf2004e78ec3dece4bed2fad4d300aac3f9ee82eecf9cde264"} Dec 04 13:58:33 crc kubenswrapper[4760]: I1204 13:58:33.625732 4760 scope.go:117] "RemoveContainer" containerID="f6e0dab7729c2825b792cdab475acdab560f176c6029564a19ef3e6fa110df6f" Dec 04 13:58:34 crc kubenswrapper[4760]: I1204 13:58:34.640678 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" event={"ID":"65f76314-9511-40ed-9ad6-2220378e7e97","Type":"ContainerStarted","Data":"a6a4e2af2dbd11555b6faf2a5633b07ca2538422b07dc1ccf5e0719913cc8631"} Dec 04 13:59:07 crc kubenswrapper[4760]: I1204 13:59:07.939333 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bqflx"] Dec 04 13:59:07 crc kubenswrapper[4760]: E1204 13:59:07.946103 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091508c8-69c0-4b18-bfd6-677625366ada" containerName="extract-content" Dec 04 13:59:07 crc kubenswrapper[4760]: I1204 13:59:07.946145 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="091508c8-69c0-4b18-bfd6-677625366ada" containerName="extract-content" Dec 04 13:59:07 crc kubenswrapper[4760]: E1204 13:59:07.946194 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091508c8-69c0-4b18-bfd6-677625366ada" containerName="registry-server" Dec 04 13:59:07 
crc kubenswrapper[4760]: I1204 13:59:07.946239 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="091508c8-69c0-4b18-bfd6-677625366ada" containerName="registry-server" Dec 04 13:59:07 crc kubenswrapper[4760]: E1204 13:59:07.946256 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79911128-6dfb-4fa9-b375-29ad707c556c" containerName="gather" Dec 04 13:59:07 crc kubenswrapper[4760]: I1204 13:59:07.946264 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="79911128-6dfb-4fa9-b375-29ad707c556c" containerName="gather" Dec 04 13:59:07 crc kubenswrapper[4760]: E1204 13:59:07.946282 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79911128-6dfb-4fa9-b375-29ad707c556c" containerName="copy" Dec 04 13:59:07 crc kubenswrapper[4760]: I1204 13:59:07.946289 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="79911128-6dfb-4fa9-b375-29ad707c556c" containerName="copy" Dec 04 13:59:07 crc kubenswrapper[4760]: E1204 13:59:07.946429 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091508c8-69c0-4b18-bfd6-677625366ada" containerName="extract-utilities" Dec 04 13:59:07 crc kubenswrapper[4760]: I1204 13:59:07.946441 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="091508c8-69c0-4b18-bfd6-677625366ada" containerName="extract-utilities" Dec 04 13:59:07 crc kubenswrapper[4760]: I1204 13:59:07.946912 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="79911128-6dfb-4fa9-b375-29ad707c556c" containerName="gather" Dec 04 13:59:07 crc kubenswrapper[4760]: I1204 13:59:07.946942 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="091508c8-69c0-4b18-bfd6-677625366ada" containerName="registry-server" Dec 04 13:59:07 crc kubenswrapper[4760]: I1204 13:59:07.947010 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="79911128-6dfb-4fa9-b375-29ad707c556c" containerName="copy" Dec 04 13:59:07 crc kubenswrapper[4760]: I1204 13:59:07.953415 4760 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:07 crc kubenswrapper[4760]: I1204 13:59:07.954850 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqflx"] Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.133070 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ljggg"] Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.136402 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.138109 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-utilities\") pod \"redhat-marketplace-bqflx\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.138228 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5986\" (UniqueName: \"kubernetes.io/projected/cbccae49-5654-4070-8c39-f8f6481caa1d-kube-api-access-r5986\") pod \"redhat-marketplace-bqflx\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.138320 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-catalog-content\") pod \"redhat-marketplace-bqflx\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.145065 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-ljggg"] Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.240597 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-utilities\") pod \"redhat-operators-ljggg\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.241002 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-catalog-content\") pod \"redhat-marketplace-bqflx\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.241140 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-catalog-content\") pod \"redhat-operators-ljggg\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.241282 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gzbk\" (UniqueName: \"kubernetes.io/projected/b74b7c27-209b-4776-b82b-2149882610b6-kube-api-access-6gzbk\") pod \"redhat-operators-ljggg\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.241375 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-utilities\") pod \"redhat-marketplace-bqflx\" (UID: 
\"cbccae49-5654-4070-8c39-f8f6481caa1d\") " pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.241440 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5986\" (UniqueName: \"kubernetes.io/projected/cbccae49-5654-4070-8c39-f8f6481caa1d-kube-api-access-r5986\") pod \"redhat-marketplace-bqflx\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.242114 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-catalog-content\") pod \"redhat-marketplace-bqflx\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.242135 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-utilities\") pod \"redhat-marketplace-bqflx\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.278393 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5986\" (UniqueName: \"kubernetes.io/projected/cbccae49-5654-4070-8c39-f8f6481caa1d-kube-api-access-r5986\") pod \"redhat-marketplace-bqflx\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.324331 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.343257 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-catalog-content\") pod \"redhat-operators-ljggg\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.343364 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gzbk\" (UniqueName: \"kubernetes.io/projected/b74b7c27-209b-4776-b82b-2149882610b6-kube-api-access-6gzbk\") pod \"redhat-operators-ljggg\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.343514 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-utilities\") pod \"redhat-operators-ljggg\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.344137 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-utilities\") pod \"redhat-operators-ljggg\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.344445 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-catalog-content\") pod \"redhat-operators-ljggg\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " 
pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.362873 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gzbk\" (UniqueName: \"kubernetes.io/projected/b74b7c27-209b-4776-b82b-2149882610b6-kube-api-access-6gzbk\") pod \"redhat-operators-ljggg\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.461046 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:08 crc kubenswrapper[4760]: I1204 13:59:08.939170 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqflx"] Dec 04 13:59:09 crc kubenswrapper[4760]: I1204 13:59:09.017519 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqflx" event={"ID":"cbccae49-5654-4070-8c39-f8f6481caa1d","Type":"ContainerStarted","Data":"57f71716e399453acd198347bd7b927216a1ea3cb70aefdbe457ba9d1f8dd20a"} Dec 04 13:59:09 crc kubenswrapper[4760]: W1204 13:59:09.101781 4760 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74b7c27_209b_4776_b82b_2149882610b6.slice/crio-d954b49de05ad9cc82b47472cc683310a6e249967bed373af594322076dc937b WatchSource:0}: Error finding container d954b49de05ad9cc82b47472cc683310a6e249967bed373af594322076dc937b: Status 404 returned error can't find the container with id d954b49de05ad9cc82b47472cc683310a6e249967bed373af594322076dc937b Dec 04 13:59:09 crc kubenswrapper[4760]: I1204 13:59:09.103506 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ljggg"] Dec 04 13:59:10 crc kubenswrapper[4760]: I1204 13:59:10.026965 4760 generic.go:334] "Generic (PLEG): container finished" 
podID="b74b7c27-209b-4776-b82b-2149882610b6" containerID="2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b" exitCode=0 Dec 04 13:59:10 crc kubenswrapper[4760]: I1204 13:59:10.027068 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljggg" event={"ID":"b74b7c27-209b-4776-b82b-2149882610b6","Type":"ContainerDied","Data":"2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b"} Dec 04 13:59:10 crc kubenswrapper[4760]: I1204 13:59:10.027338 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljggg" event={"ID":"b74b7c27-209b-4776-b82b-2149882610b6","Type":"ContainerStarted","Data":"d954b49de05ad9cc82b47472cc683310a6e249967bed373af594322076dc937b"} Dec 04 13:59:10 crc kubenswrapper[4760]: I1204 13:59:10.028832 4760 generic.go:334] "Generic (PLEG): container finished" podID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerID="2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71" exitCode=0 Dec 04 13:59:10 crc kubenswrapper[4760]: I1204 13:59:10.028872 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqflx" event={"ID":"cbccae49-5654-4070-8c39-f8f6481caa1d","Type":"ContainerDied","Data":"2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71"} Dec 04 13:59:10 crc kubenswrapper[4760]: I1204 13:59:10.030644 4760 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 13:59:11 crc kubenswrapper[4760]: I1204 13:59:11.042686 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqflx" event={"ID":"cbccae49-5654-4070-8c39-f8f6481caa1d","Type":"ContainerStarted","Data":"a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7"} Dec 04 13:59:11 crc kubenswrapper[4760]: I1204 13:59:11.047769 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-ljggg" event={"ID":"b74b7c27-209b-4776-b82b-2149882610b6","Type":"ContainerStarted","Data":"98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663"} Dec 04 13:59:12 crc kubenswrapper[4760]: I1204 13:59:12.058157 4760 generic.go:334] "Generic (PLEG): container finished" podID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerID="a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7" exitCode=0 Dec 04 13:59:12 crc kubenswrapper[4760]: I1204 13:59:12.058259 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqflx" event={"ID":"cbccae49-5654-4070-8c39-f8f6481caa1d","Type":"ContainerDied","Data":"a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7"} Dec 04 13:59:12 crc kubenswrapper[4760]: I1204 13:59:12.062771 4760 generic.go:334] "Generic (PLEG): container finished" podID="b74b7c27-209b-4776-b82b-2149882610b6" containerID="98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663" exitCode=0 Dec 04 13:59:12 crc kubenswrapper[4760]: I1204 13:59:12.062809 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljggg" event={"ID":"b74b7c27-209b-4776-b82b-2149882610b6","Type":"ContainerDied","Data":"98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663"} Dec 04 13:59:13 crc kubenswrapper[4760]: I1204 13:59:13.075300 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqflx" event={"ID":"cbccae49-5654-4070-8c39-f8f6481caa1d","Type":"ContainerStarted","Data":"3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9"} Dec 04 13:59:13 crc kubenswrapper[4760]: I1204 13:59:13.078892 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljggg" event={"ID":"b74b7c27-209b-4776-b82b-2149882610b6","Type":"ContainerStarted","Data":"d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2"} 
Dec 04 13:59:13 crc kubenswrapper[4760]: I1204 13:59:13.103863 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bqflx" podStartSLOduration=3.7112482460000003 podStartE2EDuration="6.103844851s" podCreationTimestamp="2025-12-04 13:59:07 +0000 UTC" firstStartedPulling="2025-12-04 13:59:10.030296135 +0000 UTC m=+6353.071742712" lastFinishedPulling="2025-12-04 13:59:12.42289275 +0000 UTC m=+6355.464339317" observedRunningTime="2025-12-04 13:59:13.099152633 +0000 UTC m=+6356.140599210" watchObservedRunningTime="2025-12-04 13:59:13.103844851 +0000 UTC m=+6356.145291418" Dec 04 13:59:13 crc kubenswrapper[4760]: I1204 13:59:13.157008 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ljggg" podStartSLOduration=2.716684588 podStartE2EDuration="5.156981105s" podCreationTimestamp="2025-12-04 13:59:08 +0000 UTC" firstStartedPulling="2025-12-04 13:59:10.030409269 +0000 UTC m=+6353.071855836" lastFinishedPulling="2025-12-04 13:59:12.470705786 +0000 UTC m=+6355.512152353" observedRunningTime="2025-12-04 13:59:13.12841569 +0000 UTC m=+6356.169862267" watchObservedRunningTime="2025-12-04 13:59:13.156981105 +0000 UTC m=+6356.198427672" Dec 04 13:59:18 crc kubenswrapper[4760]: I1204 13:59:18.324514 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:18 crc kubenswrapper[4760]: I1204 13:59:18.325114 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:18 crc kubenswrapper[4760]: I1204 13:59:18.374154 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:18 crc kubenswrapper[4760]: I1204 13:59:18.461986 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:18 crc kubenswrapper[4760]: I1204 13:59:18.462056 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:18 crc kubenswrapper[4760]: I1204 13:59:18.507966 4760 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:19 crc kubenswrapper[4760]: I1204 13:59:19.185920 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:19 crc kubenswrapper[4760]: I1204 13:59:19.198838 4760 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:20 crc kubenswrapper[4760]: I1204 13:59:20.124966 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqflx"] Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.150427 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bqflx" podUID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerName="registry-server" containerID="cri-o://3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9" gracePeriod=2 Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.532812 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ljggg"] Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.533388 4760 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ljggg" podUID="b74b7c27-209b-4776-b82b-2149882610b6" containerName="registry-server" containerID="cri-o://d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2" gracePeriod=2 Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.612840 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.741941 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-catalog-content\") pod \"cbccae49-5654-4070-8c39-f8f6481caa1d\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.742100 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5986\" (UniqueName: \"kubernetes.io/projected/cbccae49-5654-4070-8c39-f8f6481caa1d-kube-api-access-r5986\") pod \"cbccae49-5654-4070-8c39-f8f6481caa1d\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.742308 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-utilities\") pod \"cbccae49-5654-4070-8c39-f8f6481caa1d\" (UID: \"cbccae49-5654-4070-8c39-f8f6481caa1d\") " Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.744154 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-utilities" (OuterVolumeSpecName: "utilities") pod "cbccae49-5654-4070-8c39-f8f6481caa1d" (UID: "cbccae49-5654-4070-8c39-f8f6481caa1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.750093 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbccae49-5654-4070-8c39-f8f6481caa1d-kube-api-access-r5986" (OuterVolumeSpecName: "kube-api-access-r5986") pod "cbccae49-5654-4070-8c39-f8f6481caa1d" (UID: "cbccae49-5654-4070-8c39-f8f6481caa1d"). InnerVolumeSpecName "kube-api-access-r5986". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.767080 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbccae49-5654-4070-8c39-f8f6481caa1d" (UID: "cbccae49-5654-4070-8c39-f8f6481caa1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.845151 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5986\" (UniqueName: \"kubernetes.io/projected/cbccae49-5654-4070-8c39-f8f6481caa1d-kube-api-access-r5986\") on node \"crc\" DevicePath \"\"" Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.845191 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.845201 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbccae49-5654-4070-8c39-f8f6481caa1d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 13:59:21 crc kubenswrapper[4760]: I1204 13:59:21.937537 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.048152 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-utilities\") pod \"b74b7c27-209b-4776-b82b-2149882610b6\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.048574 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-catalog-content\") pod \"b74b7c27-209b-4776-b82b-2149882610b6\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.048910 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gzbk\" (UniqueName: \"kubernetes.io/projected/b74b7c27-209b-4776-b82b-2149882610b6-kube-api-access-6gzbk\") pod \"b74b7c27-209b-4776-b82b-2149882610b6\" (UID: \"b74b7c27-209b-4776-b82b-2149882610b6\") " Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.049317 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-utilities" (OuterVolumeSpecName: "utilities") pod "b74b7c27-209b-4776-b82b-2149882610b6" (UID: "b74b7c27-209b-4776-b82b-2149882610b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.049891 4760 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.053816 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74b7c27-209b-4776-b82b-2149882610b6-kube-api-access-6gzbk" (OuterVolumeSpecName: "kube-api-access-6gzbk") pod "b74b7c27-209b-4776-b82b-2149882610b6" (UID: "b74b7c27-209b-4776-b82b-2149882610b6"). InnerVolumeSpecName "kube-api-access-6gzbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.152083 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gzbk\" (UniqueName: \"kubernetes.io/projected/b74b7c27-209b-4776-b82b-2149882610b6-kube-api-access-6gzbk\") on node \"crc\" DevicePath \"\"" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.163994 4760 generic.go:334] "Generic (PLEG): container finished" podID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerID="3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9" exitCode=0 Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.164076 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqflx" event={"ID":"cbccae49-5654-4070-8c39-f8f6481caa1d","Type":"ContainerDied","Data":"3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9"} Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.164102 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bqflx" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.164128 4760 scope.go:117] "RemoveContainer" containerID="3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.164116 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bqflx" event={"ID":"cbccae49-5654-4070-8c39-f8f6481caa1d","Type":"ContainerDied","Data":"57f71716e399453acd198347bd7b927216a1ea3cb70aefdbe457ba9d1f8dd20a"} Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.165081 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b74b7c27-209b-4776-b82b-2149882610b6" (UID: "b74b7c27-209b-4776-b82b-2149882610b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.170691 4760 generic.go:334] "Generic (PLEG): container finished" podID="b74b7c27-209b-4776-b82b-2149882610b6" containerID="d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2" exitCode=0 Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.170829 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ljggg" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.170832 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljggg" event={"ID":"b74b7c27-209b-4776-b82b-2149882610b6","Type":"ContainerDied","Data":"d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2"} Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.170946 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ljggg" event={"ID":"b74b7c27-209b-4776-b82b-2149882610b6","Type":"ContainerDied","Data":"d954b49de05ad9cc82b47472cc683310a6e249967bed373af594322076dc937b"} Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.199756 4760 scope.go:117] "RemoveContainer" containerID="a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.202624 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqflx"] Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.223112 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bqflx"] Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.234851 4760 scope.go:117] "RemoveContainer" containerID="2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.239888 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ljggg"] Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.250138 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ljggg"] Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.254274 4760 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b7c27-209b-4776-b82b-2149882610b6-catalog-content\") on node 
\"crc\" DevicePath \"\"" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.274959 4760 scope.go:117] "RemoveContainer" containerID="3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9" Dec 04 13:59:22 crc kubenswrapper[4760]: E1204 13:59:22.275516 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9\": container with ID starting with 3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9 not found: ID does not exist" containerID="3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.275565 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9"} err="failed to get container status \"3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9\": rpc error: code = NotFound desc = could not find container \"3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9\": container with ID starting with 3e878a5b58fb09946aafe06e2f1dded96df18c3c66d3226b42cf7304740b4bc9 not found: ID does not exist" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.275599 4760 scope.go:117] "RemoveContainer" containerID="a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7" Dec 04 13:59:22 crc kubenswrapper[4760]: E1204 13:59:22.275845 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7\": container with ID starting with a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7 not found: ID does not exist" containerID="a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.275867 4760 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7"} err="failed to get container status \"a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7\": rpc error: code = NotFound desc = could not find container \"a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7\": container with ID starting with a471ff0b8497c9d09740481672feb0b91c01727db4c7798e7a9c0ac1a5a1b0d7 not found: ID does not exist" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.275886 4760 scope.go:117] "RemoveContainer" containerID="2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71" Dec 04 13:59:22 crc kubenswrapper[4760]: E1204 13:59:22.276110 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71\": container with ID starting with 2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71 not found: ID does not exist" containerID="2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.276125 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71"} err="failed to get container status \"2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71\": rpc error: code = NotFound desc = could not find container \"2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71\": container with ID starting with 2bda473bec71bfb089f2be25ade83df8e0ad59ab5c4fbeadd11e450f08100b71 not found: ID does not exist" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.276139 4760 scope.go:117] "RemoveContainer" containerID="d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 
13:59:22.332741 4760 scope.go:117] "RemoveContainer" containerID="98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.359185 4760 scope.go:117] "RemoveContainer" containerID="2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.405037 4760 scope.go:117] "RemoveContainer" containerID="d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2" Dec 04 13:59:22 crc kubenswrapper[4760]: E1204 13:59:22.405513 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2\": container with ID starting with d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2 not found: ID does not exist" containerID="d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.405554 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2"} err="failed to get container status \"d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2\": rpc error: code = NotFound desc = could not find container \"d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2\": container with ID starting with d00559ab7b8e7af6528a5d5985672892437a546d915ccef77b500e8274d3ddc2 not found: ID does not exist" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.405585 4760 scope.go:117] "RemoveContainer" containerID="98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663" Dec 04 13:59:22 crc kubenswrapper[4760]: E1204 13:59:22.406118 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663\": container 
with ID starting with 98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663 not found: ID does not exist" containerID="98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.406172 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663"} err="failed to get container status \"98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663\": rpc error: code = NotFound desc = could not find container \"98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663\": container with ID starting with 98c8f235410377ed7225c1d4bc94a5583afa816ffd97437f405d79d40014c663 not found: ID does not exist" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.406227 4760 scope.go:117] "RemoveContainer" containerID="2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b" Dec 04 13:59:22 crc kubenswrapper[4760]: E1204 13:59:22.406577 4760 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b\": container with ID starting with 2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b not found: ID does not exist" containerID="2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b" Dec 04 13:59:22 crc kubenswrapper[4760]: I1204 13:59:22.406605 4760 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b"} err="failed to get container status \"2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b\": rpc error: code = NotFound desc = could not find container \"2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b\": container with ID starting with 2e69c68f901b2dd328373e0c4036023fac535287eb3dc9ea6df750be5854305b not 
found: ID does not exist" Dec 04 13:59:23 crc kubenswrapper[4760]: I1204 13:59:23.874618 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74b7c27-209b-4776-b82b-2149882610b6" path="/var/lib/kubelet/pods/b74b7c27-209b-4776-b82b-2149882610b6/volumes" Dec 04 13:59:23 crc kubenswrapper[4760]: I1204 13:59:23.875445 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbccae49-5654-4070-8c39-f8f6481caa1d" path="/var/lib/kubelet/pods/cbccae49-5654-4070-8c39-f8f6481caa1d/volumes" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.160328 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297"] Dec 04 14:00:00 crc kubenswrapper[4760]: E1204 14:00:00.161386 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerName="extract-utilities" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.161401 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerName="extract-utilities" Dec 04 14:00:00 crc kubenswrapper[4760]: E1204 14:00:00.161413 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerName="extract-content" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.161419 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerName="extract-content" Dec 04 14:00:00 crc kubenswrapper[4760]: E1204 14:00:00.161436 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74b7c27-209b-4776-b82b-2149882610b6" containerName="extract-content" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.161443 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74b7c27-209b-4776-b82b-2149882610b6" containerName="extract-content" Dec 04 14:00:00 crc kubenswrapper[4760]: E1204 14:00:00.161456 4760 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b74b7c27-209b-4776-b82b-2149882610b6" containerName="registry-server" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.161462 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74b7c27-209b-4776-b82b-2149882610b6" containerName="registry-server" Dec 04 14:00:00 crc kubenswrapper[4760]: E1204 14:00:00.161487 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74b7c27-209b-4776-b82b-2149882610b6" containerName="extract-utilities" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.161492 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74b7c27-209b-4776-b82b-2149882610b6" containerName="extract-utilities" Dec 04 14:00:00 crc kubenswrapper[4760]: E1204 14:00:00.161502 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerName="registry-server" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.161507 4760 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerName="registry-server" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.161722 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbccae49-5654-4070-8c39-f8f6481caa1d" containerName="registry-server" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.161733 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74b7c27-209b-4776-b82b-2149882610b6" containerName="registry-server" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.162615 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.164924 4760 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.165013 4760 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.171936 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297"] Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.300005 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-secret-volume\") pod \"collect-profiles-29414280-9s297\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.300115 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pf85\" (UniqueName: \"kubernetes.io/projected/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-kube-api-access-4pf85\") pod \"collect-profiles-29414280-9s297\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.300567 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-config-volume\") pod \"collect-profiles-29414280-9s297\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.403048 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-config-volume\") pod \"collect-profiles-29414280-9s297\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.403201 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-secret-volume\") pod \"collect-profiles-29414280-9s297\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.403252 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pf85\" (UniqueName: \"kubernetes.io/projected/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-kube-api-access-4pf85\") pod \"collect-profiles-29414280-9s297\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.404086 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-config-volume\") pod \"collect-profiles-29414280-9s297\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.411530 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-secret-volume\") pod \"collect-profiles-29414280-9s297\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.424425 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pf85\" (UniqueName: \"kubernetes.io/projected/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-kube-api-access-4pf85\") pod \"collect-profiles-29414280-9s297\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:00 crc kubenswrapper[4760]: I1204 14:00:00.496518 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:01 crc kubenswrapper[4760]: I1204 14:00:01.027590 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297"] Dec 04 14:00:01 crc kubenswrapper[4760]: I1204 14:00:01.551290 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" event={"ID":"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5","Type":"ContainerStarted","Data":"2fca14172b9eee3825d15490c3fe6f53795652dddfa162aae4f1acd8b42030dd"} Dec 04 14:00:01 crc kubenswrapper[4760]: I1204 14:00:01.551338 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" event={"ID":"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5","Type":"ContainerStarted","Data":"c746b7fb2fdc77943c173f5b3ae12f172fe9cf2a09ad30d96440a377305a80c0"} Dec 04 14:00:01 crc kubenswrapper[4760]: I1204 14:00:01.569530 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" 
podStartSLOduration=1.569508457 podStartE2EDuration="1.569508457s" podCreationTimestamp="2025-12-04 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 14:00:01.568701331 +0000 UTC m=+6404.610147898" watchObservedRunningTime="2025-12-04 14:00:01.569508457 +0000 UTC m=+6404.610955024" Dec 04 14:00:02 crc kubenswrapper[4760]: I1204 14:00:02.562902 4760 generic.go:334] "Generic (PLEG): container finished" podID="77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5" containerID="2fca14172b9eee3825d15490c3fe6f53795652dddfa162aae4f1acd8b42030dd" exitCode=0 Dec 04 14:00:02 crc kubenswrapper[4760]: I1204 14:00:02.563192 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" event={"ID":"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5","Type":"ContainerDied","Data":"2fca14172b9eee3825d15490c3fe6f53795652dddfa162aae4f1acd8b42030dd"} Dec 04 14:00:03 crc kubenswrapper[4760]: I1204 14:00:03.961491 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.103247 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pf85\" (UniqueName: \"kubernetes.io/projected/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-kube-api-access-4pf85\") pod \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.103308 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-secret-volume\") pod \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.103384 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-config-volume\") pod \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\" (UID: \"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5\") " Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.103964 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-config-volume" (OuterVolumeSpecName: "config-volume") pod "77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5" (UID: "77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.104574 4760 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.109057 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5" (UID: "77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.109093 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-kube-api-access-4pf85" (OuterVolumeSpecName: "kube-api-access-4pf85") pod "77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5" (UID: "77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5"). InnerVolumeSpecName "kube-api-access-4pf85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.206400 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pf85\" (UniqueName: \"kubernetes.io/projected/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-kube-api-access-4pf85\") on node \"crc\" DevicePath \"\"" Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.206443 4760 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.612270 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" event={"ID":"77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5","Type":"ContainerDied","Data":"c746b7fb2fdc77943c173f5b3ae12f172fe9cf2a09ad30d96440a377305a80c0"} Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.612325 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c746b7fb2fdc77943c173f5b3ae12f172fe9cf2a09ad30d96440a377305a80c0" Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.612389 4760 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414280-9s297" Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.670233 4760 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557"] Dec 04 14:00:04 crc kubenswrapper[4760]: I1204 14:00:04.679733 4760 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414235-gn557"] Dec 04 14:00:05 crc kubenswrapper[4760]: I1204 14:00:05.878030 4760 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e53000b-5944-4439-8991-8f8a2337fb31" path="/var/lib/kubelet/pods/0e53000b-5944-4439-8991-8f8a2337fb31/volumes" Dec 04 14:00:33 crc kubenswrapper[4760]: I1204 14:00:33.380498 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 14:00:33 crc kubenswrapper[4760]: I1204 14:00:33.381107 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.155799 4760 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29414281-wkw98"] Dec 04 14:01:00 crc kubenswrapper[4760]: E1204 14:01:00.156921 4760 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5" containerName="collect-profiles" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.157004 4760 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5" containerName="collect-profiles" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.157422 4760 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ec5ed8-a35c-42cf-bea1-9029b2ea9dc5" containerName="collect-profiles" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.159198 4760 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.172439 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414281-wkw98"] Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.187851 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-combined-ca-bundle\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.187900 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-config-data\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.187944 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-fernet-keys\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.188011 4760 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-hntk7\" (UniqueName: \"kubernetes.io/projected/1f736595-bf21-4227-90e8-46f2d6c8b239-kube-api-access-hntk7\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.290522 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hntk7\" (UniqueName: \"kubernetes.io/projected/1f736595-bf21-4227-90e8-46f2d6c8b239-kube-api-access-hntk7\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.290677 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-combined-ca-bundle\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.290703 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-config-data\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.290739 4760 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-fernet-keys\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.299185 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-combined-ca-bundle\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.299450 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-config-data\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.299633 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-fernet-keys\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.310403 4760 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hntk7\" (UniqueName: \"kubernetes.io/projected/1f736595-bf21-4227-90e8-46f2d6c8b239-kube-api-access-hntk7\") pod \"keystone-cron-29414281-wkw98\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") " pod="openstack/keystone-cron-29414281-wkw98" Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.485072 4760 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414281-wkw98"
Dec 04 14:01:00 crc kubenswrapper[4760]: I1204 14:01:00.986870 4760 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414281-wkw98"]
Dec 04 14:01:01 crc kubenswrapper[4760]: I1204 14:01:01.160125 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414281-wkw98" event={"ID":"1f736595-bf21-4227-90e8-46f2d6c8b239","Type":"ContainerStarted","Data":"22099d063254ed84d8e8a17e99073ac7d9d001a866a6d12426c3fea7c914071e"}
Dec 04 14:01:02 crc kubenswrapper[4760]: I1204 14:01:02.171959 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414281-wkw98" event={"ID":"1f736595-bf21-4227-90e8-46f2d6c8b239","Type":"ContainerStarted","Data":"e9eef0faf861dfbf273646cd624525e81c69417b8437165413bcea8c488747b1"}
Dec 04 14:01:02 crc kubenswrapper[4760]: I1204 14:01:02.200290 4760 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29414281-wkw98" podStartSLOduration=2.200267176 podStartE2EDuration="2.200267176s" podCreationTimestamp="2025-12-04 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 14:01:02.193829442 +0000 UTC m=+6465.235276009" watchObservedRunningTime="2025-12-04 14:01:02.200267176 +0000 UTC m=+6465.241713743"
Dec 04 14:01:02 crc kubenswrapper[4760]: I1204 14:01:02.892771 4760 scope.go:117] "RemoveContainer" containerID="ade4aef42f7999b409dac4b4e616a67c3460bd369e8d64ac818a00ad17eb25ad"
Dec 04 14:01:03 crc kubenswrapper[4760]: I1204 14:01:03.380044 4760 patch_prober.go:28] interesting pod/machine-config-daemon-jnrr9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 14:01:03 crc kubenswrapper[4760]: I1204 14:01:03.380430 4760 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnrr9" podUID="65f76314-9511-40ed-9ad6-2220378e7e97" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 14:01:05 crc kubenswrapper[4760]: I1204 14:01:05.204616 4760 generic.go:334] "Generic (PLEG): container finished" podID="1f736595-bf21-4227-90e8-46f2d6c8b239" containerID="e9eef0faf861dfbf273646cd624525e81c69417b8437165413bcea8c488747b1" exitCode=0
Dec 04 14:01:05 crc kubenswrapper[4760]: I1204 14:01:05.204693 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414281-wkw98" event={"ID":"1f736595-bf21-4227-90e8-46f2d6c8b239","Type":"ContainerDied","Data":"e9eef0faf861dfbf273646cd624525e81c69417b8437165413bcea8c488747b1"}
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.575896 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414281-wkw98"
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.731248 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-combined-ca-bundle\") pod \"1f736595-bf21-4227-90e8-46f2d6c8b239\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") "
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.731542 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hntk7\" (UniqueName: \"kubernetes.io/projected/1f736595-bf21-4227-90e8-46f2d6c8b239-kube-api-access-hntk7\") pod \"1f736595-bf21-4227-90e8-46f2d6c8b239\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") "
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.731608 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-config-data\") pod \"1f736595-bf21-4227-90e8-46f2d6c8b239\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") "
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.731629 4760 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-fernet-keys\") pod \"1f736595-bf21-4227-90e8-46f2d6c8b239\" (UID: \"1f736595-bf21-4227-90e8-46f2d6c8b239\") "
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.741842 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1f736595-bf21-4227-90e8-46f2d6c8b239" (UID: "1f736595-bf21-4227-90e8-46f2d6c8b239"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.741884 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f736595-bf21-4227-90e8-46f2d6c8b239-kube-api-access-hntk7" (OuterVolumeSpecName: "kube-api-access-hntk7") pod "1f736595-bf21-4227-90e8-46f2d6c8b239" (UID: "1f736595-bf21-4227-90e8-46f2d6c8b239"). InnerVolumeSpecName "kube-api-access-hntk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.766744 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f736595-bf21-4227-90e8-46f2d6c8b239" (UID: "1f736595-bf21-4227-90e8-46f2d6c8b239"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.785960 4760 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-config-data" (OuterVolumeSpecName: "config-data") pod "1f736595-bf21-4227-90e8-46f2d6c8b239" (UID: "1f736595-bf21-4227-90e8-46f2d6c8b239"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.833414 4760 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.833444 4760 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.833453 4760 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f736595-bf21-4227-90e8-46f2d6c8b239-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 14:01:06 crc kubenswrapper[4760]: I1204 14:01:06.833464 4760 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hntk7\" (UniqueName: \"kubernetes.io/projected/1f736595-bf21-4227-90e8-46f2d6c8b239-kube-api-access-hntk7\") on node \"crc\" DevicePath \"\""
Dec 04 14:01:07 crc kubenswrapper[4760]: I1204 14:01:07.227135 4760 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414281-wkw98" event={"ID":"1f736595-bf21-4227-90e8-46f2d6c8b239","Type":"ContainerDied","Data":"22099d063254ed84d8e8a17e99073ac7d9d001a866a6d12426c3fea7c914071e"}
Dec 04 14:01:07 crc kubenswrapper[4760]: I1204 14:01:07.227177 4760 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22099d063254ed84d8e8a17e99073ac7d9d001a866a6d12426c3fea7c914071e"
Dec 04 14:01:07 crc kubenswrapper[4760]: I1204 14:01:07.227282 4760 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414281-wkw98"